## Installation

- Clone the project:

  ```shell
  $ git clone https://github.com/${YOUR_GITHUB_USERNAME}/${YOUR_REPOSITORY_NAME}.git
  ```
- Adjust `workspace/Dockerfile` based on your CUDA Toolkit version:

  Change the CUDA version of the base image to match your CUDA Toolkit version:

  ```dockerfile
  FROM nvidia/cuda:{YOUR-CUDA-TOOLKIT-VERSION}-base-ubuntu22.04 as base
  # (other settings...)
  ```

  See the official CUDA Docker images on Docker Hub to find valid CUDA version tags.
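The image tag is assembled from the CUDA version, an image flavor, and the OS tag. As a quick sanity check, a small Python helper can build the tag to look for on Docker Hub. The `cuda_base_image` name and its defaults are illustrative, not part of this project:

```python
def cuda_base_image(cuda_version: str,
                    flavor: str = "base",
                    os_tag: str = "ubuntu22.04") -> str:
    """Assemble an nvidia/cuda image tag for the Dockerfile FROM line.

    Tags on Docker Hub follow the pattern
    nvidia/cuda:<version>-<flavor>-<os>, e.g. 12.1.0-base-ubuntu22.04.
    """
    return f"nvidia/cuda:{cuda_version}-{flavor}-{os_tag}"


# For CUDA Toolkit 12.1.0 (an example version, not a project requirement):
print(cuda_base_image("12.1.0"))  # nvidia/cuda:12.1.0-base-ubuntu22.04
```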
- Start up the project:

  ```shell
  $ docker compose up --build -d
  ```

  This might take a few minutes.
- Connect to the Docker container.
- Adjust the PyTorch source in `workspace/pyproject.toml`:

  ```toml
  [tool.poetry.dependencies]
  torch = { version = "2.1.2+cu121", source = "torch" }  # Edit here

  [[tool.poetry.source]]
  name = "torch"
  url = "https://download.pytorch.org/whl/cu121"  # Edit here
  priority = "supplemental"
  ```

  After editing the config, run:

  ```shell
  $ poetry lock --no-update
  $ poetry install --no-root --sync
  ```
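The two `# Edit here` values must agree: the `+cuXYZ` suffix on the torch version and the `/whl/cuXYZ` path in the source URL both encode the same CUDA version. A small illustrative Python helper (not part of the project) shows the mapping this config relies on:

```python
def cuda_wheel_tag(cuda_version: str) -> str:
    """Map a CUDA Toolkit version such as '12.1' or '12.1.0' to PyTorch's
    wheel tag, e.g. 'cu121' (major and minor, dots dropped)."""
    major, minor = cuda_version.split(".")[:2]
    return f"cu{major}{minor}"


def torch_index_url(cuda_version: str) -> str:
    """Wheel index URL for the [[tool.poetry.source]] entry."""
    return f"https://download.pytorch.org/whl/{cuda_wheel_tag(cuda_version)}"


print(cuda_wheel_tag("12.1"))   # cu121
print(torch_index_url("12.1"))  # https://download.pytorch.org/whl/cu121
```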
- Run the commands below to check whether the installation was successful.

  - In the container, open terminal-1 and run:

    ```shell
    # Print out GPU information every 3 seconds
    $ nvidia-smi -l 3
    ```

  - In terminal-2, run:

    ```shell
    $ cd project_name
    $ poetry run python main.py
    ```

  You should then see the process in terminal-1, with GPU-Util above 0%.
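The terminal-1 check can also be scripted. The sketch below is an assumption about your workflow rather than project code; it parses the utilization figure that `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader` prints (one line per GPU, e.g. `5 %`):

```python
def parse_gpu_utilization(line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=utilization.gpu
    --format=csv,noheader` output, e.g. '5 %' -> 5."""
    return int(line.strip().rstrip("%").strip())


# On a machine with a GPU, you could feed it live output:
# import subprocess
# out = subprocess.run(
#     ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
#     capture_output=True, text=True, check=True,
# ).stdout
# busy = any(parse_gpu_utilization(line) > 0 for line in out.splitlines())

print(parse_gpu_utilization("5 %"))  # 5
```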