Install OpenRAG
OpenRAG can be installed in multiple ways:

- Python wheel: Install the OpenRAG Python wheel and use the OpenRAG Terminal User Interface (TUI) to install, run, and configure your OpenRAG deployment without running Docker commands.
- Docker Compose: Clone the OpenRAG repository and deploy OpenRAG with Docker Compose, including all services and dependencies.
Prerequisites
- Python version 3.10 to 3.13
- uv
- Docker or Podman installed
- Docker Compose installed. If using Podman, use podman-compose or alias Docker Compose commands to Podman commands.
- For GPU support: (TBD)
Python wheel
The Python wheel is currently available internally and will be available on PyPI at launch. The wheel installs OpenRAG, including the TUI for installing, running, and managing OpenRAG. For more information on virtual environments, see the uv documentation.
1. Create a new project with a virtual environment using uv:

   uv init YOUR_PROJECT_NAME
   cd YOUR_PROJECT_NAME

2. Add the OpenRAG wheel to your project and install it in the virtual environment. Replace PATH/TO/ and VERSION with your OpenRAG wheel location and version:

   uv add PATH/TO/openrag-VERSION-py3-none-any.whl

3. Ensure all dependencies are installed and updated in your virtual environment:

   uv sync

4. Start the OpenRAG TUI:

   uv run openrag

   The OpenRAG TUI opens.
5. To install OpenRAG with Basic Setup, click Basic Setup or press 1. Basic Setup does not set up OAuth connections for ingestion from Google Drive, OneDrive, or AWS. For OAuth setup, see Advanced Setup. The TUI prompts you for the required startup values. Click Generate Passwords to autocomplete the fields that contain Auto-generated Secure Password, or enter your own passwords.

   Where do I find the required startup values?

   | Variable | Where to find | Description |
   |---|---|---|
   | OPENSEARCH_PASSWORD | Auto-generated secure password | The password for OpenSearch database access. Must be at least 8 characters and contain at least one uppercase letter, one lowercase letter, one digit, and one special character. |
   | OPENAI_API_KEY | OpenAI Platform | API key from your OpenAI account. |
   | LANGFLOW_SUPERUSER | User generated | Username for Langflow admin access. For more, see the Langflow docs. |
   | LANGFLOW_SUPERUSER_PASSWORD | Auto-generated secure password | Password for Langflow admin access. For more, see the Langflow docs. |
   | LANGFLOW_SECRET_KEY | Auto-generated secure key | Secret key for Langflow security. For more, see the Langflow docs. |
   | LANGFLOW_AUTO_LOGIN | Auto-generated or manual | Auto-login configuration. For more, see the Langflow docs. |
   | LANGFLOW_NEW_USER_IS_ACTIVE | Langflow | New user activation setting. For more, see the Langflow docs. |
   | LANGFLOW_ENABLE_SUPERUSER_CLI | Langflow server | Superuser CLI access setting. For more, see the Langflow docs. |
   | DOCUMENTS_PATH | Set your local path | Path to your document storage directory. |

   When the fields are complete, click Save Configuration.
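The OPENSEARCH_PASSWORD policy (at least 8 characters, with at least one uppercase letter, one lowercase letter, one digit, and one special character) can be checked before you save the configuration. This is a minimal sketch; the `check_opensearch_password` helper is hypothetical and not part of OpenRAG:

```python
import re
import string


def check_opensearch_password(password: str) -> bool:
    """Return True if the password meets the stated OPENSEARCH_PASSWORD policy:
    at least 8 characters, with at least one uppercase letter, one lowercase
    letter, one digit, and one special character."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"\d", password) is not None
        and any(c in string.punctuation for c in password)
    )
```

For example, `check_opensearch_password("Str0ng!Pass")` passes, while a password with no digit or special character fails.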
6. To start OpenRAG with your credentials, click Start Container Services. Startup pulls container images and starts them, so it can take some time. The operation is complete when the Close button is available and the terminal displays:

   Services started successfully
   Command completed successfully

7. To open the OpenRAG application, click Open App, press 6, or navigate to http://localhost:3000. The application opens.

8. Select your language model and embedding model provider, and complete the required fields. The provider can be selected only once, and you must use the same provider for your language model and embedding model. The language model can be changed later, but the embedding model cannot. To change your provider selection, you must restart OpenRAG and delete the config.yml file.
   - OpenAI:
     - If you already entered a value for OPENAI_API_KEY in the TUI in Step 5, enable Get API key from environment variable.
     - Under Advanced settings, select your Embedding Model and Language Model.
     - To load two sample PDFs, enable Sample dataset. This is recommended, but not required.
     - Click Complete.
   - IBM watsonx.ai:
     - Complete the fields for watsonx.ai API Endpoint, IBM API key, and IBM Project ID. These values are found in your IBM watsonx deployment.
     - Under Advanced settings, select your Embedding Model and Language Model.
     - To load two sample PDFs, enable Sample dataset. This is recommended, but not required.
     - Click Complete.
   - Ollama:
     - Enter your Ollama server's base URL address. The default Ollama server address is http://localhost:11434. Because OpenRAG runs in a container, you might need to change localhost to reach services outside of the container. For example, change http://localhost:11434 to http://host.docker.internal:11434 to connect to Ollama. OpenRAG automatically sends a test connection to your Ollama server to confirm connectivity.
     - Select the Embedding Model and Language Model your Ollama server is running. OpenRAG automatically lists the available models from your Ollama server.
     - To load two sample PDFs, enable Sample dataset. This is recommended, but not required.
     - Click Complete.
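The localhost-to-host.docker.internal substitution for Ollama is a simple URL rewrite that preserves the scheme, port, and path. A minimal sketch, assuming Docker Desktop's host.docker.internal alias is available; the `rewrite_for_container` helper is hypothetical and not part of OpenRAG:

```python
from urllib.parse import urlsplit, urlunsplit


def rewrite_for_container(url: str, host_alias: str = "host.docker.internal") -> str:
    """Replace a localhost hostname with an alias reachable from inside a
    container, keeping the scheme, port, and path unchanged."""
    parts = urlsplit(url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        netloc = host_alias if parts.port is None else f"{host_alias}:{parts.port}"
        parts = parts._replace(netloc=netloc)
    return urlunsplit(parts)
```

For example, `rewrite_for_container("http://localhost:11434")` returns `http://host.docker.internal:11434`; URLs that already point outside the container are returned unchanged.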
9. Continue with the Quickstart.
Advanced Setup
Advanced Setup includes the required values from Basic Setup, with additional settings for OAuth credentials. If the OpenRAG TUI detects OAuth credentials, it enforces the Advanced Setup path.
- Add your client and secret values for Google, Azure, or AWS OAuth. These values can be found in your OAuth provider's console.
- The OpenRAG TUI presents redirect URIs for your OAuth app. These are the URLs your OAuth provider redirects back to after user sign-in. Register these redirect URIs with your OAuth provider exactly as they are presented in the TUI.
- To open the OpenRAG application, click Open App or press 6. You are presented with your provider's OAuth sign-in screen, and you are redirected to the redirect URI after sign-in.
Two additional variables are available for Advanced Setup:

- LANGFLOW_PUBLIC_URL controls where the Langflow web interface can be accessed. This is where users interact with their flows in a browser.
- WEBHOOK_BASE_URL controls where the endpoint for /connectors/CONNECTOR_TYPE/webhook is available. This connection enables real-time document synchronization with external services. For example, for Google Drive file synchronization, the webhook URL is /connectors/google_drive/webhook.
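The webhook endpoint pattern above is a straightforward join of WEBHOOK_BASE_URL and the connector type. A minimal sketch; the `webhook_url` helper is hypothetical, not an OpenRAG API:

```python
def webhook_url(base_url: str, connector_type: str) -> str:
    """Build the full webhook URL for a connector, following the
    /connectors/CONNECTOR_TYPE/webhook pattern. A trailing slash on
    base_url is tolerated."""
    return f"{base_url.rstrip('/')}/connectors/{connector_type}/webhook"
```

For example, with a hypothetical base URL of `https://example.com`, `webhook_url("https://example.com", "google_drive")` returns `https://example.com/connectors/google_drive/webhook`.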