Effortlessly orchestrate free & open-source AI services from the command-line.
Here are a few important features:
- Simple command-line interface
- NVIDIA GPU support (with multiple CUDA versions available)
- Highly configurable through `toml` configs
- Easily download ComfyUI & Ollama models from the CLI
- Config-driven ComfyUI custom nodes (git or local path) with auto-installed requirements (see the config sketch after this list)
- Supports Docker & Podman
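As a rough illustration of the config-driven custom-node feature, here is a minimal sketch. The `[[comfyui.custom_nodes]]` table name and its `git`/`path` keys are assumed placeholders rather than documented keys, so check the project's config reference for the real schema.

```toml
# Hypothetical sketch: table and key names are illustrative, not documented.
# One entry per ComfyUI custom node, fetched from git or a local path;
# each node's requirements are installed automatically.
[[comfyui.custom_nodes]]
git = "https://github.com/ltdrdata/ComfyUI-Manager.git"

[[comfyui.custom_nodes]]
path = "~/dev/my-custom-node"
```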
The following services are currently supported:
- Ollama - Language & Vision models for text, coding, tooling and more.
- llama.cpp - Lightweight C/C++ LLM inference with GGUF model support.
- Open WebUI - Feature-packed front-end chat with multi-model support.
- ComfyUI - Mature node-based UI for image generation workflows.
```sh
airpods start                    # Runs all available services
airpods start ollama open-webui  # Run specific services
airpods status                   # Get the status of every running service
airpods logs ollama              # View ollama's latest logs
airpods stop                     # Stop all running services
airpods stop comfyui             # Only stop ComfyUI (if running)
```

Run `airpods --help` for all available commands and options.
Requirements:

- `uv`: Install, upgrade & manage `airpods`
- Container Runtime (one of the following):
  - `podman` + `podman-compose` (recommended)
  - `docker` + `docker-compose`
> [!NOTE]
> AirPods automatically detects which container runtime is available. If both are installed, Podman is preferred by default. You can explicitly choose a runtime by setting `runtime.prefer` in your config file.
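For example, as a minimal sketch of that override (the accepted values are assumed here to match the runtime names, which the note above does not spell out):

```toml
# Prefer Podman even when Docker is also installed.
[runtime]
prefer = "podman"
```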
Install `airpods` with `uv` from Releases, or:
```sh
# This is the nightly installation.
uv tool install "git+https://github.com/radicazz/airpods.git@main"
```

> [!IMPORTANT]
> Upgrade your installation with `uv tool upgrade airpods` when a new version is available.
Check out LICENSE for more details.