Running FastAI book locally on Linux with Virtualenv

I finally got an NVidia 3060 and built a new machine around it now that GPU prices are more reasonable. I am running Manjaro on this machine, and one of the reasons for putting it together was to continue working through the FastAI book. The approach recommended by the authors is to use cloud GPU providers (Google Colab, Paperspace, etc.), which is absolutely the right recommendation for most people. But I also think it is important to have working instructions for those who want a local setup. Here are the instructions that work circa October 2022 for Manjaro Linux + an NVidia 3060 with the proprietary NVidia drivers:

Create the virtual environment, activate it and change to the fastai directory:

virtualenv fastai
source fastai/bin/activate
cd fastai
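
Before installing anything, it is worth double-checking that the virtual environment is really the one providing python and pip. The exact paths will differ on your machine, but both commands below should point inside the fastai directory you just created:

which python
pip --version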

Install PyTorch – instead of relying on a dependency-based installation (i.e. letting fastbook pull in whatever PyTorch it wants), it is best to install PyTorch using the right incantation for your chosen environment. Go to the PyTorch local installation page, choose your specific configuration, and run the command it gives you inside your fastai environment. Here is the command for installing PyTorch with pip and CUDA 11.6 support:

pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu116
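
Once that finishes, it is a good idea to confirm that this PyTorch build can actually see the GPU. This is plain PyTorch, nothing fastai-specific:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"

If this prints True next to a version tagged with cu116, the CUDA-enabled wheel was picked up; if it prints False, recheck the driver and CUDA choice before going further.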

Install fastbook and its dependencies:

pip install fastbook
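
Since fastbook pulls in fastai itself, a quick import check confirms both landed in the environment (the version number you see will depend on when you run this):

python -c "import fastai, fastbook; print(fastai.__version__)"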

Install the Jupyter notebook, the Jupyter notebook extensions, and ipywidgets:

pip install notebook
pip install jupyter_contrib_nbextensions
pip install ipywidgets
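
Depending on your notebook version, the classic notebook extensions may need a one-time registration step after the pip installs. The commands below are the ones documented by jupyter_contrib_nbextensions and ipywidgets; treat them as optional, since recent notebook releases enable the widgets extension automatically:

# --sys-prefix keeps the extension files inside the virtualenv; --user is the other documented option
jupyter contrib nbextension install --sys-prefix
jupyter nbextension enable --py widgetsnbextension --sys-prefix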

Clone the Fastbook git repo

git clone https://github.com/fastai/fastbook

Change to the Fastbook folder and start your local Jupyter notebook server

cd fastbook
jupyter notebook

You should be all set and ready to learn how to use fastai + PyTorch to build deep learning models on your local machine!
