
Python SDK Setup 🐍

Focoos models support multiple inference runtimes. To keep the library lightweight and to let users control their environment, optional dependencies (e.g., torch, onnxruntime, tensorrt) are not installed by default.

Focoos ships with the following local inference runtimes, each of which requires additional dependencies. If you intend to use only Focoos AI servers for inference, you do not need to install any of them.

| RuntimeType | Extra | Runtime | Compatible Devices | Available ExecutionProviders |
|---|---|---|---|---|
| ONNX_CUDA32 | [cuda] | onnxruntime CUDA | NVIDIA GPUs | CUDAExecutionProvider |
| ONNX_TRT32 | [tensorrt] | onnxruntime TRT | NVIDIA GPUs (Optimized) | CUDAExecutionProvider, TensorrtExecutionProvider |
| ONNX_TRT16 | [tensorrt] | onnxruntime TRT | NVIDIA GPUs (Optimized) | CUDAExecutionProvider, TensorrtExecutionProvider |
| ONNX_CPU | [cpu] | onnxruntime CPU | CPU (x86, ARM), M1, M2, M3 (Apple Silicon) | CPUExecutionProvider, CoreMLExecutionProvider, AzureExecutionProvider |
| ONNX_COREML | [cpu] | onnxruntime CPU | M1, M2, M3 (Apple Silicon) | CoreMLExecutionProvider, CPUExecutionProvider |
| TORCHSCRIPT_32 | [torch] | torchscript | CPU, NVIDIA GPUs | - |
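After installing one of the ONNX-based extras, you can check which execution providers are actually available in your environment. The snippet below is a minimal sketch using the standard `onnxruntime` API (`get_available_providers()`); it is independent of Focoos itself and degrades gracefully when `onnxruntime` is not installed:

```python
import importlib.util


def available_onnx_providers():
    """Return the ONNX Runtime execution providers, or [] if onnxruntime is absent."""
    if importlib.util.find_spec("onnxruntime") is None:
        return []
    import onnxruntime

    # e.g. ['CUDAExecutionProvider', 'CPUExecutionProvider'] on a CUDA setup
    return onnxruntime.get_available_providers()


print(available_onnx_providers())
```

If the provider you expect (e.g. CUDAExecutionProvider) is missing from the output, the corresponding extra or its system dependencies are not correctly installed.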

Install the Focoos SDK

The Focoos SDK can be installed with different package managers and requires Python 3.10 or above.

We recommend UV (how to install uv) as a package and environment manager for streamlined dependency management.

You can create a new virtual environment with UV using the following commands:

uv venv --python 3.12
source .venv/bin/activate
Then install the Focoos SDK, optionally with the cpu extra for local CPU inference:

uv pip install 'focoos @ git+https://github.com/FocoosAI/focoos.git'
uv pip install 'focoos[cpu] @ git+https://github.com/FocoosAI/focoos.git'

To run models with the TorchScript runtime, install the torch extra:

uv pip install 'focoos[torch] @ git+https://github.com/FocoosAI/focoos.git'

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

uv pip install 'focoos[cuda] @ git+https://github.com/FocoosAI/focoos.git'
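Before installing the cuda extra, a rough sanity check is to verify that the CUDA toolchain is visible on PATH. Note this is only a proxy: finding `nvcc` suggests the CUDA toolkit is installed, but a driver-only setup may still work if the right shared libraries are present.

```python
import shutil


def cuda_toolchain_status():
    """Report whether the CUDA compiler (nvcc) is on PATH; a rough proxy for a CUDA install."""
    return "found" if shutil.which("nvcc") else "missing"


print(cuda_toolchain_status())
```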

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

To perform inference using TensorRT, ensure TensorRT version 10.5 is installed.

uv pip install 'focoos[tensorrt] @ git+https://github.com/FocoosAI/focoos.git'
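To confirm the TensorRT Python package and its version, you can query installed package metadata. This assumes TensorRT was installed as the pip package named tensorrt; a system-level (apt) TensorRT install may not expose this metadata, so None here is not conclusive.

```python
from importlib import metadata


def tensorrt_version():
    """Return the installed tensorrt pip package version string, or None if absent."""
    try:
        return metadata.version("tensorrt")
    except metadata.PackageNotFoundError:
        return None


print(tensorrt_version())
```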

Alternatively, create and activate a new virtual environment with Python's built-in venv module and install with pip:

python -m venv .venv
source .venv/bin/activate

pip install 'focoos @ git+https://github.com/FocoosAI/focoos.git'
pip install 'focoos[cpu] @ git+https://github.com/FocoosAI/focoos.git'
pip install 'focoos[torch] @ git+https://github.com/FocoosAI/focoos.git'

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

pip install 'focoos[cuda] @ git+https://github.com/FocoosAI/focoos.git'

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

To perform inference using TensorRT, ensure TensorRT version 10.5 is installed.

pip install 'focoos[tensorrt] @ git+https://github.com/FocoosAI/focoos.git'

Create and activate a new conda (how to install conda) environment with Python 3.10 or higher:

conda create -n focoos python=3.12
conda activate focoos
conda install pip

pip install 'focoos @ git+https://github.com/FocoosAI/focoos.git'
pip install 'focoos[cpu] @ git+https://github.com/FocoosAI/focoos.git'
pip install 'focoos[torch] @ git+https://github.com/FocoosAI/focoos.git'

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

pip install 'focoos[cuda] @ git+https://github.com/FocoosAI/focoos.git'

Additional requirements: onnxruntime 1.20.1 requires CUDA 12 and cuDNN 9. To install cuDNN 9:

apt-get -y install cudnn9-cuda-12

To perform inference using TensorRT, ensure TensorRT version 10.5 is installed.

pip install 'focoos[tensorrt] @ git+https://github.com/FocoosAI/focoos.git'

Note

🤖 Multiple Runtimes: You can install multiple extras at once, e.g. pip install .[torch,cuda,tensorrt]. However, the cpu extra cannot be combined with cuda or tensorrt.

Note

🛠️ Installation Tip: If you want to install a specific version, for example v0.1.3, use:

pip install 'focoos @ git+https://github.com/FocoosAI/focoos.git@v0.1.3'

📋 Check Versions: Visit https://github.com/FocoosAI/focoos/tags for available versions.

Docker and Devcontainers

For container support, Focoos offers four different Docker images:

  • focoos-cpu: CPU only
  • focoos-cuda: Includes ONNX (CUDA) support
  • focoos-torch: Includes ONNX and Torchscript (CUDA) support
  • focoos-tensorrt: Includes ONNX, Torchscript, and TensorRT support

To use one of these images, build it with the corresponding build target and then run it (--target is a docker build flag, not a docker run flag):

docker build -t focoos-cpu --target=focoos-cpu .
docker run -it focoos-cpu

This repository also includes a devcontainer configuration for each of the above images. You can launch these devcontainers in Visual Studio Code for a seamless development experience.