Environment variables

OpenRAG recognizes configuration values from the following sources, in order of precedence: environment variables set in a .env file, configuration file values, and built-in defaults.

Configure environment variables

Environment variables are set in a .env file in the root of your OpenRAG project directory.

For an example .env file, see .env.example in the OpenRAG repository.

The Docker Compose files are populated with values from your .env file, so you don't need to edit them manually.
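
If you want to confirm which values Docker Compose will use, you can print the resolved configuration. This is a sketch that relies only on standard Docker Compose behavior, not on anything OpenRAG-specific:

    # Print the Compose configuration with values from .env substituted
    docker-compose config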

Environment variables always take precedence over configuration file values and built-in defaults.

Set environment variables

To set environment variables, do the following.

  1. Stop OpenRAG.
  2. Set the values in the .env file:
    LOG_LEVEL=DEBUG
    LOG_FORMAT=json
    SERVICE_NAME=openrag-dev
  3. Start OpenRAG.
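
If you run OpenRAG with Docker Compose, as in the provider-change procedure below, note that restarting containers in place does not re-read the .env file; recreating the containers applies the new values. A minimal sketch:

    # docker-compose restart does NOT pick up .env changes;
    # recreating the containers does
    docker-compose up -d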

Changes to provider API keys or provider endpoints in the .env file don't take effect after Application onboarding. To change these values after onboarding, do the following:

  1. Stop OpenRAG.
  2. Remove the containers:
    docker-compose down
  3. Update the values in your .env file, as shown in the example after this procedure.
  4. Start the OpenRAG containers:
    docker-compose up -d
  5. Complete Application onboarding again.
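
For example, step 3 might change the provider key and endpoint like this. Both values below are placeholders, not working credentials:

    PROVIDER_API_KEY=your-new-provider-api-key
    PROVIDER_ENDPOINT=https://your-provider-endpoint.example.com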

Supported environment variables

All OpenRAG configuration can be controlled through environment variables.

AI provider settings

Configure which AI models and providers OpenRAG uses for language processing and embeddings. For more information, see Application onboarding.

| Variable | Default | Description |
| --- | --- | --- |
| EMBEDDING_MODEL | text-embedding-3-small | Embedding model for vector search. |
| LLM_MODEL | gpt-4o-mini | Language model for the chat agent. |
| MODEL_PROVIDER | openai | Model provider, such as OpenAI or IBM watsonx.ai. |
| OPENAI_API_KEY | - | Your OpenAI API key. Required. |
| PROVIDER_API_KEY | - | API key for the model provider. |
| PROVIDER_ENDPOINT | - | Custom provider endpoint. Only used for IBM or Ollama providers. |
| PROVIDER_PROJECT_ID | - | Project ID for providers. Only required for the IBM watsonx.ai provider. |
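
As a minimal sketch, a default OpenAI setup needs only the provider, the API key, and (optionally) the model overrides; the key shown is a placeholder:

    MODEL_PROVIDER=openai
    OPENAI_API_KEY=sk-your-openai-key
    LLM_MODEL=gpt-4o-mini
    EMBEDDING_MODEL=text-embedding-3-small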

Document processing

Control how OpenRAG processes and ingests documents into your knowledge base. For more information, see Ingestion.

| Variable | Default | Description |
| --- | --- | --- |
| CHUNK_OVERLAP | 200 | Overlap between chunks. |
| CHUNK_SIZE | 1000 | Text chunk size for document processing. |
| DISABLE_INGEST_WITH_LANGFLOW | false | Disable Langflow ingestion pipeline. |
| DOCLING_OCR_ENGINE | - | OCR engine for document processing. |
| OCR_ENABLED | false | Enable OCR for image processing. |
| OPENRAG_DOCUMENTS_PATHS | ./documents | Document paths for ingestion. |
| PICTURE_DESCRIPTIONS_ENABLED | false | Enable picture descriptions. |
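
For example, to ingest documents from a custom directory with larger chunks and OCR enabled, a sketch might look like this; the path is a placeholder and the chunk values are only illustrative:

    OPENRAG_DOCUMENTS_PATHS=./my-docs
    CHUNK_SIZE=1500
    CHUNK_OVERLAP=300
    OCR_ENABLED=true
    PICTURE_DESCRIPTIONS_ENABLED=true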

Langflow settings

Configure Langflow authentication.

| Variable | Default | Description |
| --- | --- | --- |
| LANGFLOW_AUTO_LOGIN | False | Enable auto-login for Langflow. |
| LANGFLOW_CHAT_FLOW_ID | pre-filled | This value is pre-filled. The default value is found in .env.example. |
| LANGFLOW_ENABLE_SUPERUSER_CLI | False | Enable superuser CLI. |
| LANGFLOW_INGEST_FLOW_ID | pre-filled | This value is pre-filled. The default value is found in .env.example. |
| LANGFLOW_KEY | auto-generated | Explicit Langflow API key. |
| LANGFLOW_NEW_USER_IS_ACTIVE | False | New users are active by default. |
| LANGFLOW_PUBLIC_URL | http://localhost:7860 | Public URL for Langflow. |
| LANGFLOW_SECRET_KEY | - | Secret key for Langflow internal operations. |
| LANGFLOW_SUPERUSER | - | Langflow admin username. Required. |
| LANGFLOW_SUPERUSER_PASSWORD | - | Langflow admin password. Required. |
| LANGFLOW_URL | http://localhost:7860 | Langflow URL. |
| NUDGES_FLOW_ID | pre-filled | This value is pre-filled. The default value is found in .env.example. |
| SYSTEM_PROMPT | "You are a helpful AI assistant with access to a knowledge base. Answer questions based on the provided context." | System prompt for the Langflow agent. |
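
A minimal sketch that sets the required Langflow credentials and overrides the agent's system prompt; the username and password are placeholders:

    LANGFLOW_SUPERUSER=admin
    LANGFLOW_SUPERUSER_PASSWORD=change-me
    SYSTEM_PROMPT="You are a support assistant. Answer only from the provided context."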

OAuth provider settings

Configure OAuth providers and external service integrations.

| Variable | Default | Description |
| --- | --- | --- |
| AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY | - | AWS integrations. |
| GOOGLE_OAUTH_CLIENT_ID / GOOGLE_OAUTH_CLIENT_SECRET | - | Google OAuth authentication. |
| MICROSOFT_GRAPH_OAUTH_CLIENT_ID / MICROSOFT_GRAPH_OAUTH_CLIENT_SECRET | - | Microsoft OAuth. |
| WEBHOOK_BASE_URL | - | Base URL for webhook endpoints. |
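
For example, enabling Google OAuth means supplying the client credentials from your Google Cloud project and, if you use webhooks, a publicly reachable base URL. All values below are placeholders:

    GOOGLE_OAUTH_CLIENT_ID=your-client-id.apps.googleusercontent.com
    GOOGLE_OAUTH_CLIENT_SECRET=your-client-secret
    WEBHOOK_BASE_URL=https://openrag.example.com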

OpenSearch settings

Configure OpenSearch database authentication.

| Variable | Default | Description |
| --- | --- | --- |
| OPENSEARCH_HOST | localhost | OpenSearch host. |
| OPENSEARCH_PASSWORD | - | Password for OpenSearch admin user. Required. |
| OPENSEARCH_PORT | 9200 | OpenSearch port. |
| OPENSEARCH_USERNAME | admin | OpenSearch username. |
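
A sketch for a local OpenSearch instance on the default port; the password is a placeholder and should satisfy OpenSearch's password strength requirements:

    OPENSEARCH_HOST=localhost
    OPENSEARCH_PORT=9200
    OPENSEARCH_USERNAME=admin
    OPENSEARCH_PASSWORD=A-Strong-Passw0rd!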

System settings

Configure general system components, session management, and logging.

| Variable | Default | Description |
| --- | --- | --- |
| LANGFLOW_KEY_RETRIES | 15 | Number of retries for Langflow key generation. |
| LANGFLOW_KEY_RETRY_DELAY | 2.0 | Delay between retries, in seconds. |
| LANGFLOW_VERSION | latest | Langflow Docker image version. |
| LOG_FORMAT | - | Log format (set to "json" for JSON output). |
| LOG_LEVEL | INFO | Logging level (DEBUG, INFO, WARNING, ERROR). |
| MAX_WORKERS | - | Maximum number of workers for document processing. |
| OPENRAG_VERSION | latest | OpenRAG Docker image version. |
| SERVICE_NAME | openrag | Service name for logging. |
| SESSION_SECRET | auto-generated | Secret used for session management. |
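
For example, a quieter production-style sketch with JSON logs and a capped worker pool; the worker count and service name are only illustrative:

    LOG_LEVEL=WARNING
    LOG_FORMAT=json
    MAX_WORKERS=4
    SERVICE_NAME=openrag-prod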

Langflow runtime overrides

Langflow runtime overrides allow you to modify component settings at runtime without changing the base configuration.

Runtime overrides are implemented through tweaks: parameter modifications that are passed to specific Langflow components during flow execution.

For more information on tweaks, see Input schema (tweaks).
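
As an illustration of the tweaks mechanism, a Langflow run request can carry a tweaks object that overrides a component parameter for that execution only. This is a sketch: the component ID (Agent-abc12) and its parameter are placeholders, the actual IDs come from your flow as described in Input schema (tweaks), and the URL matches the default LANGFLOW_URL:

    # Run the chat flow once with an overridden component parameter
    curl -X POST "http://localhost:7860/api/v1/run/$LANGFLOW_CHAT_FLOW_ID" \
      -H "Content-Type: application/json" \
      -H "x-api-key: $LANGFLOW_KEY" \
      -d '{
        "input_value": "What is in my knowledge base?",
        "tweaks": {
          "Agent-abc12": { "system_prompt": "Answer in one sentence." }
        }
      }'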

Default values and fallbacks

When no environment variables or configuration file values are provided, OpenRAG uses default values. These values can be found in the code base at the following locations.

OpenRAG configuration defaults

These values are defined in config_manager.py in the OpenRAG repository.

System configuration defaults

These fallback values are defined in settings.py in the OpenRAG repository.