---
title: Environment variables
slug: /reference/configuration
---

import Icon from "@site/src/components/icon/icon";
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

OpenRAG recognizes environment variables from the following sources:

* [Environment variables](#configure-environment-variables): Values set in the `.env` file.
* [Langflow runtime overrides](#langflow-runtime-overrides): Langflow components can set environment variables at runtime.
* [Default or fallback values](#default-values-and-fallbacks): Default or fallback values that OpenRAG uses if it doesn't find a value from the other sources.

## Configure environment variables

Environment variables are set in a `.env` file in the root of your OpenRAG project directory. For an example `.env` file, see [`.env.example` in the OpenRAG repository](https://github.com/langflow-ai/openrag/blob/main/.env.example).

The Docker Compose files are populated with values from your `.env` file, so you don't need to edit the Docker Compose files manually.

Environment variables always take precedence over values from the other sources.

### Set environment variables {#set-environment-variables}

After you start OpenRAG, you must [stop and restart OpenRAG containers](/install#tui-container-management) to apply any changes you make to the `.env` file.

To set mutable environment variables, do the following:

1. Stop OpenRAG with the TUI or Docker Compose.
2. Set the values in the `.env` file:

   ```bash
   LOG_LEVEL=DEBUG
   LOG_FORMAT=json
   SERVICE_NAME=openrag-dev
   ```

3. Start OpenRAG with the TUI or Docker Compose.

Certain environment variables that you set during [application onboarding](/install#application-onboarding), such as provider API keys and provider endpoints, are immutable: restarting the containers isn't enough, and you must recreate the containers after modifying the `.env` file.

To change immutable variables with TUI-managed containers, you must [reinstall OpenRAG](/install#reinstall) and either delete or modify the `.env` file before you repeat the setup and onboarding process in the TUI.

To change immutable variables with self-managed containers, do the following:

1. Stop OpenRAG with Docker Compose.
2. Remove the containers:

   ```bash
   docker-compose down
   ```

3. Update the values in your `.env` file.
4. Start OpenRAG with Docker Compose:

   ```bash
   docker-compose up -d
   ```

5. Repeat [application onboarding](/install#application-onboarding). The values in your `.env` file are automatically populated.

## Supported environment variables

All OpenRAG configuration can be controlled through environment variables.

### AI provider settings

Configure which models and providers OpenRAG uses to generate text and embeddings. These are initially set during [application onboarding](/install#application-onboarding). Some values are immutable and can only be changed by recreating the OpenRAG containers, as explained in [Set environment variables](#set-environment-variables).

| Variable | Default | Description |
|----------|---------|-------------|
| `EMBEDDING_MODEL` | `text-embedding-3-small` | Embedding model for generating vector embeddings for documents in the knowledge base and similarity search queries. Can be changed after application onboarding. Accepts one or more models. |
| `LLM_MODEL` | `gpt-4o-mini` | Language model for language processing and text generation in the **Chat** feature. |
| `MODEL_PROVIDER` | `openai` | Model provider, such as OpenAI or IBM watsonx.ai. |
| `OPENAI_API_KEY` | Not set | Optional OpenAI API key for the default model. For other providers, use `PROVIDER_API_KEY`. |
| `PROVIDER_API_KEY` | Not set | API key for the model provider. |
| `PROVIDER_ENDPOINT` | Not set | Custom provider endpoint for the IBM and Ollama model providers. Leave unset for other model providers. |
| `PROVIDER_PROJECT_ID` | Not set | Project ID for the IBM watsonx.ai model provider only. Leave unset for other model providers. |
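
For a provider other than OpenAI, several of these variables work together. The following is a minimal, illustrative sketch of an IBM watsonx.ai configuration; the provider identifier, model names, endpoint, and project ID are placeholders, so use the values from your onboarding choices or from [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example):

```bash
# Illustrative placeholders only: substitute the values for your provider.
MODEL_PROVIDER=<provider-id>                # for example, the IBM watsonx.ai provider
LLM_MODEL=<chat-model-name>
EMBEDDING_MODEL=<embedding-model-name>
PROVIDER_API_KEY=<provider-api-key>
PROVIDER_ENDPOINT=<provider-endpoint-url>   # IBM and Ollama providers only
PROVIDER_PROJECT_ID=<project-id>            # IBM watsonx.ai only
```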
### Document processing

Control how OpenRAG [processes and ingests documents](/ingestion) into your knowledge base.

| Variable | Default | Description |
|----------|---------|-------------|
| `CHUNK_OVERLAP` | `200` | Overlap between chunks. |
| `CHUNK_SIZE` | `1000` | Text chunk size for document processing. |
| `DISABLE_INGEST_WITH_LANGFLOW` | `false` | Disable the Langflow ingestion pipeline. |
| `DOCLING_OCR_ENGINE` | Set by OS | OCR engine for document processing. For macOS, `ocrmac`. For any other OS, `easyocr`. |
| `OCR_ENABLED` | `false` | Enable OCR for image processing. |
| `OPENRAG_DOCUMENTS_PATHS` | `./openrag-documents` | Document paths for ingestion. |
| `PICTURE_DESCRIPTIONS_ENABLED` | `false` | Enable picture descriptions. |

### Langflow settings

Configure Langflow authentication.

| Variable | Default | Description |
|----------|---------|-------------|
| `LANGFLOW_AUTO_LOGIN` | `False` | Enable auto-login for Langflow. |
| `LANGFLOW_CHAT_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the chat [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
| `LANGFLOW_ENABLE_SUPERUSER_CLI` | `False` | Enable superuser privileges for Langflow CLI commands. |
| `LANGFLOW_INGEST_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the ingestion [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
| `LANGFLOW_KEY` | Automatically generated | Explicit Langflow API key. |
| `LANGFLOW_NEW_USER_IS_ACTIVE` | `False` | Whether new Langflow users are active by default. |
| `LANGFLOW_PUBLIC_URL` | `http://localhost:7860` | Public URL for the Langflow instance. |
| `LANGFLOW_SECRET_KEY` | Not set | Secret key for Langflow internal operations. |
| `LANGFLOW_SUPERUSER` | None, must be explicitly set | Langflow admin username. Required. |
| `LANGFLOW_SUPERUSER_PASSWORD` | None, must be explicitly set | Langflow admin password. Required. |
| `LANGFLOW_URL` | `http://localhost:7860` | URL for the Langflow instance. |
| `NUDGES_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the nudges [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
| `SYSTEM_PROMPT` | `You are a helpful AI assistant with access to a knowledge base. Answer questions based on the provided context.` | System prompt instructions for the agent driving the **Chat** flow. |

### OAuth provider settings

Configure OAuth providers and external service integrations.

| Variable | Default | Description |
|----------|---------|-------------|
| `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` | - | Credentials for AWS integrations. |
| `GOOGLE_OAUTH_CLIENT_ID` / `GOOGLE_OAUTH_CLIENT_SECRET` | - | Google OAuth authentication. |
| `MICROSOFT_GRAPH_OAUTH_CLIENT_ID` / `MICROSOFT_GRAPH_OAUTH_CLIENT_SECRET` | - | Microsoft OAuth authentication. |
| `WEBHOOK_BASE_URL` | - | Base URL for webhook endpoints. |

### OpenSearch settings

Configure OpenSearch database authentication.

| Variable | Default | Description |
|----------|---------|-------------|
| `OPENSEARCH_HOST` | `localhost` | OpenSearch host. |
| `OPENSEARCH_PASSWORD` | - | Password for the OpenSearch admin user. Required. |
| `OPENSEARCH_PORT` | `9200` | OpenSearch port. |
| `OPENSEARCH_USERNAME` | `admin` | OpenSearch username. |

### System settings

Configure general system components, session management, and logging.

| Variable | Default | Description |
|----------|---------|-------------|
| `LANGFLOW_KEY_RETRIES` | `15` | Number of retries for Langflow key generation. |
| `LANGFLOW_KEY_RETRY_DELAY` | `2.0` | Delay between retries in seconds. |
| `LANGFLOW_VERSION` | `OPENRAG_VERSION` | Langflow Docker image version. By default, this is the same as `OPENRAG_VERSION`. |
| `LOG_FORMAT` | Disabled | Set to `json` to enable JSON-formatted log output. |
| `LOG_LEVEL` | `INFO` | Logging level: `DEBUG`, `INFO`, `WARNING`, or `ERROR`. |
| `MAX_WORKERS` | `1` | Maximum number of workers for document processing. |
| `OPENRAG_VERSION` | `latest` | The version of the OpenRAG Docker images to run. For more information, see [Upgrade OpenRAG](/install#upgrade). |
| `SERVICE_NAME` | `openrag` | Service name for logging. |
| `SESSION_SECRET` | Automatically generated | Secret used for session management. |

## Langflow runtime overrides

You can modify [flow](/agents) settings at runtime without permanently changing the flow's configuration. Runtime overrides are implemented through _tweaks_, which are one-time parameter modifications that are passed to specific Langflow components during flow execution.

For more information on tweaks, see the Langflow documentation on [Input schema (tweaks)](https://docs.langflow.org/concepts-publish#input-schema).
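
For example, if you call the Langflow API directly, tweaks are passed in the request payload when a flow runs. The following is a minimal, illustrative sketch; the component ID and the `temperature` parameter are hypothetical and depend on the components in your flow, and `LANGFLOW_URL`, `LANGFLOW_CHAT_FLOW_ID`, and `LANGFLOW_KEY` are the environment variables described above:

```bash
# Illustrative only: the component ID and parameter below are placeholders.
curl -X POST "$LANGFLOW_URL/api/v1/run/$LANGFLOW_CHAT_FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_KEY" \
  -d '{
    "input_value": "What is in my knowledge base?",
    "tweaks": {
      "LanguageModelComponent-abc12": { "temperature": 0.2 }
    }
  }'
```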
## Default values and fallbacks

If a variable isn't set by environment variables or a configuration file, OpenRAG can use a default value if one is defined in the codebase.

Default values can be found in the OpenRAG repository:

* OpenRAG configuration: [`config_manager.py`](https://github.com/langflow-ai/openrag/blob/main/src/config/config_manager.py)
* System configuration: [`settings.py`](https://github.com/langflow-ai/openrag/blob/main/src/config/settings.py)
* Logging configuration: [`logging_config.py`](https://github.com/langflow-ai/openrag/blob/main/src/utils/logging_config.py)
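
To confirm which value is actually in effect for a running deployment, you can inspect the resolved Compose configuration and the container environment. The following is a minimal sketch that assumes self-managed containers and a Compose service named `openrag`; substitute the service name from your own Docker Compose file:

```bash
# Show how Docker Compose resolves values from your .env file.
docker-compose config | grep LOG_LEVEL

# Print the variable as the running container actually sees it.
# Replace "openrag" with the service name from your Compose file.
docker-compose exec openrag env | grep LOG_LEVEL
```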