A Nextcloud External Application (ExApp) that integrates Open WebUI - a feature-rich chat interface for LLMs - directly into Nextcloud.
- Beautiful Chat Interface - Modern, intuitive UI for interacting with AI
- Multiple Model Support - Connect to Ollama, OpenAI, or any compatible API
- Conversation History - Persistent chat history with search
- Document Upload - RAG capabilities with file uploads
- Multi-user Support - Each Nextcloud user gets their own workspace
- Mobile Friendly - Responsive design works on all devices
- Nextcloud 30 or higher
- AppAPI installed and configured
- Docker with a configured Deploy Daemon (HaRP recommended)
- Ollama ExApp (recommended) or external LLM API
- Install and enable the AppAPI app in Nextcloud
- Configure a Deploy Daemon
- Install the Ollama ExApp first (recommended)
- Search for "Open WebUI" in the External Apps section
- Click Install
# Register the ExApp with Nextcloud
occ app_api:app:register \
open_webui \
<your-daemon-name> \
--info-xml https://raw.githubusercontent.com/ConductionNL/open-webui-nextcloud/main/appinfo/info.xml \
--force-scopes

| Environment Variable | Description | Default |
|---|---|---|
| `OLLAMA_BASE_URL` | URL of the Ollama API | Auto-detected from the Ollama ExApp |
| `OPENAI_API_BASE_URL` | OpenAI-compatible API URL | Not set |
| `OPENAI_API_KEY` | API key for the OpenAI-compatible endpoint | Not set |
| `ENABLE_SIGNUP` | Allow user self-registration | `false` |
After installation, access Open WebUI through Nextcloud:
https://your-nextcloud/index.php/apps/app_api/proxy/open_webui
Or use the External Apps section in Nextcloud's admin panel.
- Open the WebUI through Nextcloud
- Create an admin account (first user becomes admin)
- Configure your LLM backend in Settings
- Start chatting!
If you have the Ollama ExApp installed, Open WebUI will automatically detect and connect to it. No manual configuration needed.
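This resolution order can be sketched as follows. The helper below is hypothetical (not the app's actual code), and the default ExApp address is an assumption for illustration: an explicit `OLLAMA_BASE_URL` always wins, otherwise the wrapper probes the conventional Ollama ExApp address.

```python
import os
import urllib.request

# Assumed address for illustration; the real ExApp address may differ.
DEFAULT_OLLAMA_EXAPP_URL = "http://nc-ollama:11434"

def resolve_ollama_url(probe=None):
    """Return the Ollama base URL: the env var wins, else probe the ExApp."""
    explicit = os.environ.get("OLLAMA_BASE_URL")
    if explicit:
        return explicit
    # Auto-detect: treat the ExApp as present if a quick probe succeeds.
    probe = probe or _probe
    if probe(DEFAULT_OLLAMA_EXAPP_URL):
        return DEFAULT_OLLAMA_EXAPP_URL
    return None  # no backend found; configure one manually in Settings

def _probe(url):
    try:
        with urllib.request.urlopen(url, timeout=2):
            return True
    except OSError:
        return False
```

Injecting the `probe` callable keeps the auto-detection step separate from the precedence logic, which is the part worth getting right.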
To use OpenAI or other compatible APIs:
- Go to Settings in Open WebUI
- Add your API endpoint and key
- Select your preferred model
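To sanity-check an endpoint before entering it in Settings, you can ask it for its model list: OpenAI-compatible APIs expose `GET /v1/models` with a Bearer token. A minimal sketch (the base URL and key in the usage comment are placeholders):

```python
import json
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET /v1/models request for an OpenAI-compatible API."""
    url = base_url.rstrip("/") + "/v1/models"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs advertised by the endpoint."""
    req = build_models_request(base_url, api_key)
    with urllib.request.urlopen(req, timeout=10) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

# Usage (placeholder values):
# list_models("https://api.openai.com", "sk-...")
```

If the call succeeds and the model you want appears in the list, the same base URL and key should work in Open WebUI's Settings.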
docker build -t open-webui-exapp:dev .

docker run -it --rm \
-e APP_ID=open_webui \
-e APP_SECRET=dev-secret \
-e NEXTCLOUD_URL=http://localhost:8080 \
-e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
-p 9000:9000 \
-p 8080:8080 \
open-webui-exapp:dev

# Health check
curl http://localhost:9000/heartbeat
# Open WebUI health
curl http://localhost:9000/health

┌─────────────────────────────────────┐
│ Nextcloud + AppAPI │
└──────────────┬──────────────────────┘
│
▼
┌─────────────────────────────────────┐
│ Open WebUI ExApp Container │
│ ┌───────────────────────────────┐ │
│ │ FastAPI Wrapper (port 9000) │ │
│ │ - /heartbeat │ │
│ │ - /init │ │
│ │ - /enabled │ │
│ │ - /* (proxy to Open WebUI) │ │
│ └───────────────┬───────────────┘ │
│ │ │
│ ┌───────────────▼───────────────┐ │
│ │ Open WebUI (port 8080) │ │
│ │ ┌─────────────────────────┐ │ │
│ │ │ SQLite Database │ │ │
│ │ │ /data/webui.db │ │ │
│ │ └─────────────────────────┘ │ │
│ └───────────────────────────────┘ │
└──────────────┬──────────────────────┘
│
▼
┌─────────────────────────────────────┐
│ Ollama ExApp (LLM Backend) │
└─────────────────────────────────────┘
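The wrapper's role in the diagram can be sketched with only the standard library. This is not the app's actual implementation (that is a FastAPI app), routes other than `/heartbeat` and `/enabled` are simplified, and the upstream port is taken from the diagram:

```python
import http.server
import threading
import urllib.request

UPSTREAM = "http://127.0.0.1:8080"  # Open WebUI, per the diagram

class WrapperHandler(http.server.BaseHTTPRequestHandler):
    """Answers AppAPI lifecycle routes itself, proxies everything else."""

    def do_GET(self):
        if self.path == "/heartbeat":
            self._send(200, b'{"status": "ok"}')
        elif self.path == "/enabled":
            self._send(200, b"")
        else:
            # Forward any other request to Open WebUI.
            try:
                with urllib.request.urlopen(UPSTREAM + self.path, timeout=10) as up:
                    self._send(up.status, up.read())
            except OSError:
                self._send(502, b"upstream unavailable")

    def _send(self, code, body):
        self.send_response(code)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the sketch quiet
        pass

def serve(port=9000):
    """Start the wrapper on the given port in a background thread."""
    srv = http.server.ThreadingHTTPServer(("127.0.0.1", port), WrapperHandler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv
```

The key idea is that Nextcloud's AppAPI only ever talks to port 9000: the lifecycle endpoints are answered locally, while chat traffic passes through to the Open WebUI process on port 8080 inside the same container.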
For the best experience, install all three ExApps:
- Ollama ExApp - Local LLM inference
- Open WebUI ExApp - Chat interface (this app)
- n8n ExApp - Workflow automation with AI
AGPL-3.0 - See LICENSE for details.