---
title: Install OpenRAG containers
slug: /docker
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import PartialOnboarding from '@site/docs/_partial-onboarding.mdx';
import PartialWsl from '@site/docs/_partial-wsl-install.mdx';

OpenRAG has two Docker Compose files. Both files deploy the same applications and containers locally, but they are for different environments:

- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment with GPU support for accelerated AI processing. This Docker Compose file requires an NVIDIA GPU with [CUDA](https://docs.nvidia.com/cuda/) support.

- [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without NVIDIA GPU support. Use this Docker Compose file for environments where GPU drivers aren't available.
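
If you aren't sure which file applies to your machine, one quick heuristic is to check for the `nvidia-smi` tool, which NVIDIA drivers typically install. This is only a sketch: a working GPU setup also needs CUDA support and container GPU passthrough, which this check doesn't verify.

```shell
# Pick a Compose file based on whether NVIDIA driver tooling is present.
# This is only a heuristic: nvidia-smi ships with the NVIDIA driver, but a
# working GPU setup also needs CUDA and container GPU passthrough.
if command -v nvidia-smi >/dev/null 2>&1; then
  echo "NVIDIA driver detected: try docker-compose.yml"
else
  echo "No NVIDIA driver found: use docker-compose-cpu.yml"
fi
```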
## Prerequisites

- Install the following:

    - [Python](https://www.python.org/downloads/) version 3.13 or later.
    - [uv](https://docs.astral.sh/uv/getting-started/installation/).
    - [Podman](https://podman.io/docs/installation) (recommended) or [Docker](https://docs.docker.com/get-docker/).
    - [`podman-compose`](https://docs.podman.io/en/latest/markdown/podman-compose.1.html) or [Docker Compose](https://docs.docker.com/compose/install/). To use Docker Compose with Podman, you must alias Docker Compose commands to Podman commands.

- Microsoft Windows only: To run OpenRAG on Windows, you must use the Windows Subsystem for Linux (WSL).

    <details>
    <summary>Install WSL for OpenRAG</summary>

    <PartialWsl />

    </details>
- Prepare model providers and credentials.

    During [Application Onboarding](#application-onboarding), you must select a language model provider and an embedding model provider.
    If your chosen provider offers both model types, you can use the same provider for both selections.
    If a provider offers only one type, as Anthropic does with language models, you must select a second provider for the other type.

    Gather the credentials and connection details for your chosen model providers before starting onboarding:

    - OpenAI: Create an [OpenAI API key](https://platform.openai.com/api-keys).
    - Anthropic language models: Create an [Anthropic API key](https://www.anthropic.com/docs/api/reference).
    - IBM watsonx.ai: Get your watsonx.ai API endpoint, IBM project ID, and IBM API key from your watsonx deployment.
    - Ollama: Use the [Ollama documentation](https://docs.ollama.com/) to set up your Ollama instance locally, in the cloud, or on a remote server, and then get your Ollama server's base URL.

- Optional: Set up GPU support. To use the GPU-accelerated Docker Compose file, the OpenRAG host machine must have an NVIDIA GPU with [CUDA](https://docs.nvidia.com/cuda/) support and compatible NVIDIA drivers. If you don't set up GPU support, use the CPU-only Docker Compose file instead.
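
If you use Podman, one way to map the Docker Compose commands in this guide to Podman is with shell aliases. This is a sketch that assumes a Bash-compatible shell and that Podman and `podman-compose` are installed; add the aliases to your shell profile to make them persistent.

```shell
# Make `docker` and `docker-compose` invoke Podman equivalents in this
# shell session (assumes Podman and podman-compose are installed).
alias docker=podman
alias docker-compose=podman-compose
```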
## Install OpenRAG with Docker Compose

To install OpenRAG with Docker Compose, do the following:

1. Clone the OpenRAG repository.

    ```bash
    git clone https://github.com/langflow-ai/openrag.git
    cd openrag
    ```

2. Install dependencies.

    ```bash
    uv sync
    ```

3. Copy the example `.env` file included in the repository root.
    The example file includes all environment variables with comments to guide you in finding and setting their values.

    ```bash
    cp .env.example .env
    ```

    Alternatively, create a new `.env` file in the repository root:

    ```bash
    touch .env
    ```
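
If you start from an empty file, a minimal `.env` might look like the following. This is a sketch: the values are placeholders, not real credentials, and the full set of variables is covered in the next step.

```shell
# Write a minimal .env with the one required variable; optional values
# are left commented out (placeholders, not real credentials).
cat > .env <<'EOF'
OPENSEARCH_PASSWORD=your_opensearch_admin_password
# OPENAI_API_KEY=your_openai_api_key
# LANGFLOW_SECRET_KEY=your_secret_key
EOF
```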

4. Set values in your `.env` file.
    The Docker Compose files are populated with the values from your `.env` file.

    The `OPENSEARCH_PASSWORD` value must be set.
    The TUI can generate `OPENSEARCH_PASSWORD` automatically, but for a Docker Compose installation you must set it yourself. For guidance on creating an OpenSearch admin password, see the [OpenSearch documentation](https://docs.opensearch.org/latest/security/configuration/demo-configuration/#setting-up-a-custom-admin-password).

    The following values are optional:

    ```bash
    OPENAI_API_KEY=your_openai_api_key
    LANGFLOW_SECRET_KEY=your_secret_key
    ```

    `OPENAI_API_KEY` is optional. You can provide it during [Application Onboarding](#application-onboarding) or choose a different model provider. If you want to set it in your `.env` file, you can find your OpenAI API key in your [OpenAI account](https://platform.openai.com/api-keys).

    `LANGFLOW_SECRET_KEY` is optional. Langflow auto-generates it if it isn't set. For more information, see the [Langflow documentation](https://docs.langflow.org/api-keys-and-authentication#langflow-secret-key).

    The following Langflow configuration values are optional but important to consider:

    ```bash
    LANGFLOW_SUPERUSER=admin
    LANGFLOW_SUPERUSER_PASSWORD=your_langflow_password
    ```

    `LANGFLOW_SUPERUSER` defaults to `admin`. You can omit it or set it to a different username. `LANGFLOW_SUPERUSER_PASSWORD` is optional. If omitted, Langflow runs in [autologin mode](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login) with no password required. If set, Langflow requires password authentication.

    For more information about configuring OpenRAG with environment variables, see [Environment variables](/reference/configuration).
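
One way to generate strong values for these variables is with `openssl`. This is a sketch: it assumes `openssl` is installed, and OpenSearch enforces its own password complexity rules, so verify that the generated password meets them before using it.

```shell
# Generate candidate secrets (any strong random source works).
# OpenSearch may enforce additional password complexity rules.
OPENSEARCH_PASSWORD="$(openssl rand -base64 24)"
LANGFLOW_SECRET_KEY="$(openssl rand -hex 32)"
echo "OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}"
echo "LANGFLOW_SECRET_KEY=${LANGFLOW_SECRET_KEY}"
```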

5. Start `docling serve` on the host machine.
    OpenRAG Docker installations require that `docling serve` is running on port 5001 on the host machine.
    This enables [Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing.

    ```bash
    uv run python scripts/docling_ctl.py start --port 5001
    ```

6. Confirm `docling serve` is running:

    ```bash
    uv run python scripts/docling_ctl.py status
    ```

    Make sure the response shows that `docling serve` is running, for example:

    ```text
    Status: running
    Endpoint: http://127.0.0.1:5001
    Docs: http://127.0.0.1:5001/docs
    PID: 27746
    ```

7. Deploy OpenRAG locally with Docker Compose based on your deployment type.

    <Tabs groupId="Compose file">
    <TabItem value="docker-compose.yml" label="docker-compose.yml" default>

    ```bash
    docker compose build
    docker compose up -d
    ```

    </TabItem>
    <TabItem value="docker-compose-cpu.yml" label="docker-compose-cpu.yml">

    ```bash
    docker compose -f docker-compose-cpu.yml up -d
    ```

    </TabItem>
    </Tabs>

    The OpenRAG Docker Compose file starts five containers:

    | Container Name | Default Address | Purpose |
    |---|---|---|
    | OpenRAG Backend | http://localhost:8000 | FastAPI server and core functionality. |
    | OpenRAG Frontend | http://localhost:3000 | React web interface for users. |
    | Langflow | http://localhost:7860 | AI workflow engine and flow management. |
    | OpenSearch | http://localhost:9200 | Vector database for document storage. |
    | OpenSearch Dashboards | http://localhost:5601 | Database administration interface. |

8. Verify the installation by confirming all services are running:

    ```bash
    docker compose ps
    ```

    You can now access OpenRAG at the following endpoints:

    - **Frontend**: http://localhost:3000
    - **Backend API**: http://localhost:8000
    - **Langflow**: http://localhost:7860

9. Continue with [Application Onboarding](#application-onboarding).

To stop `docling serve` when you're done with your OpenRAG deployment, run:

```bash
uv run python scripts/docling_ctl.py stop
```

<PartialOnboarding />

## Container management commands

Manage your OpenRAG containers with the following commands.
These commands are also available in the TUI's [Status menu](/install#status).

### Upgrade containers

Upgrade your containers to the latest version while preserving your data:

```bash
docker compose pull
docker compose up -d --force-recreate
```

### Rebuild containers (destructive)

Reset state by rebuilding all of your containers.
Your OpenSearch and Langflow databases will be lost.
Documents stored in the `./documents` directory will persist, since the directory is mounted as a volume in the OpenRAG backend container.

```bash
docker compose up --build --force-recreate --remove-orphans
```

### Remove all containers and data (destructive)

Completely remove your OpenRAG installation.
This deletes all of your data, including OpenSearch data, uploaded documents, and authentication data.

```bash
docker compose down --volumes --remove-orphans --rmi local
docker system prune -f
```