From 4538433720c2bfb94c4d58ea364acc8cd0d85a94 Mon Sep 17 00:00:00 2001
From: April M <36110273+aimurphy@users.noreply.github.com>
Date: Fri, 5 Dec 2025 11:23:20 -0800
Subject: [PATCH] environment variables
---
docs/docs/_partial-integrate-chat.mdx | 4 +-
docs/docs/_partial-onboarding.mdx | 6 +-
docs/docs/_partial-prereq-common.mdx | 20 +++---
docs/docs/_partial-prereq-no-script.mdx | 8 +--
docs/docs/_partial-prereq-python.mdx | 2 +-
docs/docs/_partial-prereq-windows.mdx | 2 +-
docs/docs/_partial-setup.mdx | 18 +++---
docs/docs/core-components/knowledge.mdx | 4 +-
docs/docs/get-started/docker.mdx | 76 ++++++++++++-----------
docs/docs/get-started/install-options.mdx | 8 +--
docs/docs/get-started/install-uv.mdx | 2 +-
docs/docs/get-started/install-uvx.mdx | 2 +-
docs/docs/get-started/install.mdx | 2 +-
docs/docs/get-started/quickstart.mdx | 2 +-
docs/docs/reference/configuration.mdx | 54 +++++++++-------
15 files changed, 111 insertions(+), 99 deletions(-)
diff --git a/docs/docs/_partial-integrate-chat.mdx b/docs/docs/_partial-integrate-chat.mdx
index de3d9a62..867ecd27 100644
--- a/docs/docs/_partial-integrate-chat.mdx
+++ b/docs/docs/_partial-integrate-chat.mdx
@@ -4,8 +4,8 @@ import TabItem from '@theme/TabItem';
1. Open the **OpenRAG OpenSearch Agent** flow in the Langflow visual editor: From the **Chat** window, click **Settings**, click **Edit in Langflow**, and then click **Proceed**.
-2. Create a [Langflow API key](https://docs.langflow.org/api-keys-and-authentication), which is a user-specific token required to send requests to the Langflow server.
-This key doesn't grant access to OpenRAG.
+2. Optional: If you don't want to use the Langflow API key that is generated automatically when you install OpenRAG, you can create a [Langflow API key](https://docs.langflow.org/api-keys-and-authentication).
+This key doesn't grant access to OpenRAG; it is only for authenticating with the Langflow API.
1. In the Langflow visual editor, click your user icon in the header, and then select **Settings**.
2. Click **Langflow API Keys**, and then click **Add New**.
diff --git a/docs/docs/_partial-onboarding.mdx b/docs/docs/_partial-onboarding.mdx
index a8628648..ca8a994f 100644
--- a/docs/docs/_partial-onboarding.mdx
+++ b/docs/docs/_partial-onboarding.mdx
@@ -31,7 +31,7 @@ Anthropic doesn't provide embedding models. If you select Anthropic for your lan
3. Click **Complete**.
4. Select a provider for embeddings, provide the required information, and then select the embedding model you want to use.
-For information about another provider's credentials and settings, see the instructions for your chosen provider.
+For information about another provider's credentials and settings, see the instructions for that provider.
5. Continue through the overview slides for a brief introduction to OpenRAG, or click **Skip overview**.
The overview demonstrates some basic functionality that is covered in the [quickstart](/quickstart#chat-with-documents) and in other parts of the OpenRAG documentation.
@@ -46,7 +46,7 @@ The overview demonstrates some basic functionality that is covered in the [quick
3. Click **Complete**.
4. Select a provider for embeddings, provide the required information, and then select the embedding model you want to use.
-For information about another provider's credentials and settings, see the instructions for your chosen provider.
+For information about another provider's credentials and settings, see the instructions for that provider.
5. Continue through the overview slides for a brief introduction to OpenRAG, or click **Skip overview**.
The overview demonstrates some basic functionality that is covered in the [quickstart](/quickstart#chat-with-documents) and in other parts of the OpenRAG documentation.
@@ -97,7 +97,7 @@ The overview demonstrates some basic functionality that is covered in the [quick
3. Click **Complete**.
4. Select a provider for embeddings, provide the required information, and then select the embedding model you want to use.
-For information about another provider's credentials and settings, see the instructions for your chosen provider.
+For information about another provider's credentials and settings, see the instructions for that provider.
5. Continue through the overview slides for a brief introduction to OpenRAG, or click **Skip overview**.
The overview demonstrates some basic functionality that is covered in the [quickstart](/quickstart#chat-with-documents) and in other parts of the OpenRAG documentation.
diff --git a/docs/docs/_partial-prereq-common.mdx b/docs/docs/_partial-prereq-common.mdx
index 16e7bcdd..66374fcc 100644
--- a/docs/docs/_partial-prereq-common.mdx
+++ b/docs/docs/_partial-prereq-common.mdx
@@ -1,12 +1,12 @@
-- Gather the credentials and connection details for your preferred model providers.
+* Gather the credentials and connection details for your preferred model providers.
+You must have access to at least one language model and one embedding model.
+If a provider offers both types, you can use the same provider for both models.
+If a provider offers only one type, you must select two providers.
- - OpenAI: Create an [OpenAI API key](https://platform.openai.com/api-keys).
- - Anthropic language models: Create an [Anthropic API key](https://www.anthropic.com/docs/api/reference).
- - IBM watsonx.ai: Get your watsonx.ai API endpoint, IBM project ID, and IBM API key from your watsonx deployment.
- - Ollama: Use the [Ollama documentation](https://docs.ollama.com/) to set up your Ollama instance locally, in the cloud, or on a remote server, and then get your Ollama server's base URL.
+ * **OpenAI**: Create an [OpenAI API key](https://platform.openai.com/api-keys).
+ * **Anthropic**: Create an [Anthropic API key](https://www.anthropic.com/docs/api/reference).
+ Anthropic provides language models only; you must select an additional provider for embeddings.
+ * **IBM watsonx.ai**: Get your watsonx.ai API endpoint, IBM project ID, and IBM API key from your watsonx deployment.
+ * **Ollama**: Deploy an [Ollama instance and models](https://docs.ollama.com/) locally, in the cloud, or on a remote server, and then get your Ollama server's base URL and the names of the models that you want to use.
- You must have access to at least one language model and one embedding model.
- If your chosen provider offers both types, you can use the same provider for both models.
- If your provider offers only one type, such as Anthropic, you must select two providers.
-
-- Optional: Install GPU support with an NVIDIA GPU, [CUDA](https://docs.nvidia.com/cuda/) support, and compatible NVIDIA drivers on the OpenRAG host machine. If you don't have GPU capabilities, OpenRAG provides an alternate CPU-only deployment.
\ No newline at end of file
+* Optional: For GPU acceleration, equip the OpenRAG host machine with an NVIDIA GPU, [CUDA](https://docs.nvidia.com/cuda/) support, and compatible NVIDIA drivers. If you don't have GPU capabilities, OpenRAG provides an alternate CPU-only deployment.
\ No newline at end of file
diff --git a/docs/docs/_partial-prereq-no-script.mdx b/docs/docs/_partial-prereq-no-script.mdx
index 83c65495..a8fd8349 100644
--- a/docs/docs/_partial-prereq-no-script.mdx
+++ b/docs/docs/_partial-prereq-no-script.mdx
@@ -1,6 +1,6 @@
-- Install [uv](https://docs.astral.sh/uv/getting-started/installation/).
+* Install [uv](https://docs.astral.sh/uv/getting-started/installation/).
-- Install [Podman](https://podman.io/docs/installation) (recommended) or [Docker](https://docs.docker.com/get-docker/).
+* Install [Podman](https://podman.io/docs/installation) (recommended) or [Docker](https://docs.docker.com/get-docker/).
-- Install [`podman-compose`](https://docs.podman.io/en/latest/markdown/podman-compose.1.html) or [Docker Compose](https://docs.docker.com/compose/install/).
-To use Docker Compose with Podman, you must alias Docker Compose commands to Podman commands.
\ No newline at end of file
+* Install [`podman-compose`](https://docs.podman.io/en/latest/markdown/podman-compose.1.html) or [Docker Compose](https://docs.docker.com/compose/install/).
+To use Docker Compose with Podman, you must alias Docker Compose commands to Podman commands.
\ No newline at end of file
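To use Docker Compose commands with Podman, as the last prerequisite notes, you can alias them in your shell profile. A hypothetical `~/.bashrc` fragment (adjust for your shell and how you installed Podman):

```shell
# Route Docker CLI and Docker Compose invocations to their Podman equivalents.
alias docker='podman'
alias docker-compose='podman-compose'
```

With these aliases in place, commands written for Docker, such as `docker-compose up -d`, run through Podman instead.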
diff --git a/docs/docs/_partial-prereq-python.mdx b/docs/docs/_partial-prereq-python.mdx
index b386a3ab..77036589 100644
--- a/docs/docs/_partial-prereq-python.mdx
+++ b/docs/docs/_partial-prereq-python.mdx
@@ -1 +1 @@
-- Install [Python](https://www.python.org/downloads/release/python-3100/) version 3.13 or later.
\ No newline at end of file
+* Install [Python](https://www.python.org/downloads/) version 3.13 or later.
\ No newline at end of file
diff --git a/docs/docs/_partial-prereq-windows.mdx b/docs/docs/_partial-prereq-windows.mdx
index 8187f44f..eeb671c1 100644
--- a/docs/docs/_partial-prereq-windows.mdx
+++ b/docs/docs/_partial-prereq-windows.mdx
@@ -1,2 +1,2 @@
-- For Microsoft Windows, you must use the Windows Subsystem for Linux (WSL).
+* For Microsoft Windows, you must use the Windows Subsystem for Linux (WSL).
See [Install OpenRAG on Windows](/install-windows) before proceeding.
\ No newline at end of file
diff --git a/docs/docs/_partial-setup.mdx b/docs/docs/_partial-setup.mdx
index 14a2bc8b..056e58dd 100644
--- a/docs/docs/_partial-setup.mdx
+++ b/docs/docs/_partial-setup.mdx
@@ -19,8 +19,8 @@ If OpenRAG detects OAuth credentials during setup, it recommends **Advanced Setu
The OpenSearch password is required.
- The Langflow password is optional.
- If the Langflow password is empty, Langflow runs in [autologin mode](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login) without password authentication.
+ The Langflow password is recommended but optional.
+ If the Langflow password is empty, the Langflow server starts without authentication enabled. For more information, see [Langflow settings](/reference/configuration#langflow-settings).
3. Optional: Enter your OpenAI API key, or leave this field empty if you want to configure model provider credentials later during application onboarding.
@@ -57,8 +57,8 @@ If OpenRAG detects OAuth credentials during setup, it recommends **Advanced Setu
The OpenSearch password is required.
- The Langflow password is optional.
- If the Langflow password is empty, Langflow runs in [autologin mode](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login) without password authentication.
+ The Langflow password is recommended but optional.
+ If the Langflow password is empty, the Langflow server starts without authentication enabled. For more information, see [Langflow settings](/reference/configuration#langflow-settings).
3. Optional: Enter your OpenAI API key, or leave this field empty if you want to configure model provider credentials later during application onboarding.
@@ -76,7 +76,7 @@ If OpenRAG detects OAuth credentials during setup, it recommends **Advanced Setu
6. Click **Save Configuration**.
- Your passwords, API key (if provided), and OAuth credentials (if provided) are stored in the `.env` file in your OpenRAG installation directory.
+ Your passwords, API key, and OAuth credentials, if provided, are stored in the `.env` file in your OpenRAG installation directory.
If you modified any credentials that were pulled from an existing `.env` file, those values are updated in the `.env` file.
7. Click **Start All Services** to start the OpenRAG services that run in containers.
@@ -105,10 +105,10 @@ If OpenRAG detects OAuth credentials during setup, it recommends **Advanced Setu
* `WEBHOOK_BASE_URL`: Sets the base address for the following OpenRAG OAuth connector endpoints:
- - Amazon S3: Not applicable.
- - Google Drive: `WEBHOOK_BASE_URL/connectors/google_drive/webhook`
- - OneDrive: `WEBHOOK_BASE_URL/connectors/onedrive/webhook`
- - SharePoint: `WEBHOOK_BASE_URL/connectors/sharepoint/webhook`
+ * Amazon S3: Not applicable.
+ * Google Drive: `WEBHOOK_BASE_URL/connectors/google_drive/webhook`
+ * OneDrive: `WEBHOOK_BASE_URL/connectors/onedrive/webhook`
+ * SharePoint: `WEBHOOK_BASE_URL/connectors/sharepoint/webhook`
12. Continue with [application onboarding](#application-onboarding).
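As a quick illustration of how the connector endpoints above are formed, each webhook address is `WEBHOOK_BASE_URL` plus a fixed path suffix. The base URL below is an example, not a real deployment:

```shell
# Example only: compose the connector webhook endpoints from WEBHOOK_BASE_URL.
WEBHOOK_BASE_URL="https://openrag.example.com"
for connector in google_drive onedrive sharepoint; do
  echo "${WEBHOOK_BASE_URL}/connectors/${connector}/webhook"
done
```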
diff --git a/docs/docs/core-components/knowledge.mdx b/docs/docs/core-components/knowledge.mdx
index 0d4b2f05..cfb2715c 100644
--- a/docs/docs/core-components/knowledge.mdx
+++ b/docs/docs/core-components/knowledge.mdx
@@ -34,12 +34,12 @@ When you [install OpenRAG](/install-options), you provide the initial configurat
This includes authentication credentials for OpenSearch and OAuth connectors.
This configuration determines how OpenRAG authenticates with OpenSearch and controls access to documents in your knowledge base:
-* **No-auth mode (basic setup)**: If you choose **Basic Setup** in the [TUI](/tui), or your `.env` file doesn't include OAuth credentials, then OpenRAG runs in no-auth mode.
+* **No-auth mode (basic setup)**: If you select **Basic Setup** in the [TUI](/tui), or your `.env` file doesn't include OAuth credentials, then the OpenRAG OpenSearch instance runs in no-auth mode.
This mode uses one anonymous JWT token for OpenSearch authentication.
There is no differentiation between users; all users that access your OpenRAG instance can access all documents uploaded to your knowledge base.
-* **OAuth mode (advanced setup)**: If you choose **Advanced Setup** in the [TUI](/tui), or your `.env` file includes OAuth credentials, then OpenRAG runs in OAuth mode.
+* **OAuth mode (advanced setup)**: If you select **Advanced Setup** in the [TUI](/tui), or your `.env` file includes OAuth credentials, then the OpenRAG OpenSearch instance runs in OAuth mode.
This mode uses a unique JWT token for each OpenRAG user, and each document is tagged with user ownership.
Documents are filtered by user owner; users see only the documents that they uploaded or have access to through their cloud storage accounts.
diff --git a/docs/docs/get-started/docker.mdx b/docs/docs/get-started/docker.mdx
index b6700d5a..c204baf8 100644
--- a/docs/docs/get-started/docker.mdx
+++ b/docs/docs/get-started/docker.mdx
@@ -26,67 +26,73 @@ Use this installation method if you don't want to [use the Terminal User Interfa
-## Install OpenRAG with Docker Compose
+## Prepare your deployment
-To install OpenRAG with Docker Compose, do the following:
+1. Clone the OpenRAG repository:
-1. Clone the OpenRAG repository.
```bash
git clone https://github.com/langflow-ai/openrag.git
+ ```
+
+2. Change to the root of the cloned repository:
+
+ ```bash
cd openrag
```
-2. Install dependencies.
+3. Install dependencies:
+
```bash
uv sync
```
-3. Copy the example `.env` file included in the repository root.
- The example file includes all environment variables with comments to guide you in finding and setting their values.
+4. Create a `.env` file at the root of the cloned repository.
+
+ You can create an empty file or copy the repository's [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example) file.
+ The example file contains some of the [OpenRAG environment variables](/reference/configuration) to get you started with configuring your deployment.
+
```bash
cp .env.example .env
```
- Alternatively, create a new `.env` file in the repository root.
- ```
- touch .env
- ```
+5. Edit the `.env` file to configure your deployment using [OpenRAG environment variables](/reference/configuration).
+The OpenRAG Docker Compose files pull values from your `.env` file to configure the OpenRAG containers.
+The following variables are required or recommended:
-4. The Docker Compose files are populated with the values from your `.env` file.
- The `OPENSEARCH_PASSWORD` value must be set.
- `OPENSEARCH_PASSWORD` can be automatically generated when using the TUI, but for a Docker Compose installation, you can set it manually instead. To generate an OpenSearch admin password, see the [OpenSearch documentation](https://docs.opensearch.org/latest/security/configuration/demo-configuration/#setting-up-a-custom-admin-password).
+ * **`OPENSEARCH_PASSWORD` (Required)**: Sets the OpenSearch administrator password. It must adhere to the [OpenSearch password complexity requirements](https://docs.opensearch.org/latest/security/configuration/demo-configuration/#setting-up-a-custom-admin-password).
- The following values are optional:
+ * **`LANGFLOW_SUPERUSER`**: The username for the Langflow administrator user. Defaults to `admin` if not set.
- ```env
- OPENAI_API_KEY=your_openai_api_key
- LANGFLOW_SECRET_KEY=your_secret_key
- ```
+ * **`LANGFLOW_SUPERUSER_PASSWORD` (Strongly recommended)**: Sets the Langflow administrator password, and determines the Langflow server's default authentication mode. If not set, the Langflow server starts without authentication enabled. For more information, see [Langflow settings](/reference/configuration#langflow-settings).
- `OPENAI_API_KEY` is optional. You can provide it during [application onboarding](#application-onboarding) or choose a different model provider. If you want to set it in your `.env` file, you can find your OpenAI API key in your [OpenAI account](https://platform.openai.com/api-keys).
+ * **`LANGFLOW_SECRET_KEY` (Strongly recommended)**: A secret encryption key for internal Langflow operations. It is recommended to [generate your own Langflow secret key](https://docs.langflow.org/api-keys-and-authentication#langflow-secret-key). If not set, Langflow generates a secret key automatically.
- `LANGFLOW_SECRET_KEY` is optional. Langflow will auto-generate it if not set. For more information, see the [Langflow documentation](https://docs.langflow.org/api-keys-and-authentication#langflow-secret-key).
+ * **Model provider credentials**: Provide credentials for your preferred model providers. If not set in the `.env` file, you must configure at least one provider during [application onboarding](#application-onboarding).
- The following Langflow configuration values are optional but important to consider:
+ * `OPENAI_API_KEY`
+ * `ANTHROPIC_API_KEY`
+ * `OLLAMA_ENDPOINT`
+ * `WATSONX_API_KEY`
+ * `WATSONX_ENDPOINT`
+ * `WATSONX_PROJECT_ID`
- ```env
- LANGFLOW_SUPERUSER=admin
- LANGFLOW_SUPERUSER_PASSWORD=your_langflow_password
- ```
+ * **OAuth provider credentials**: To upload documents from external storage, such as Google Drive, set the required OAuth credentials for the connectors that you want to use. You can [manage OAuth credentials](/ingestion#oauth-ingestion) later, but it is recommended to configure them during initial setup so you don't have to rebuild the containers.
- `LANGFLOW_SUPERUSER` defaults to `admin`. You can omit it or set it to a different username. `LANGFLOW_SUPERUSER_PASSWORD` is optional. If omitted, Langflow runs in [autologin mode](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login) with no password required. If set, Langflow requires password authentication.
-
- For more information on configuring OpenRAG with environment variables, see [Environment variables](/reference/configuration).
+ * **Amazon**: Provide your AWS Access Key ID and AWS Secret Access Key with access to your S3 instance. For more information, see the AWS documentation on [Configuring access to AWS applications](https://docs.aws.amazon.com/singlesignon/latest/userguide/manage-your-applications.html).
+ * **Google**: Provide your Google OAuth Client ID and Google OAuth Client Secret. You can generate these in the [Google Cloud Console](https://console.cloud.google.com/apis/credentials). For more information, see the [Google OAuth client documentation](https://developers.google.com/identity/protocols/oauth2).
+ * **Microsoft**: For the Microsoft OAuth Client ID and Microsoft OAuth Client Secret, provide [Azure application registration credentials for SharePoint and OneDrive](https://learn.microsoft.com/en-us/onedrive/developer/rest-api/getting-started/app-registration?view=odsp-graph-online). For more information, see the [Microsoft Graph OAuth client documentation](https://learn.microsoft.com/en-us/onedrive/developer/rest-api/getting-started/graph-oauth).
-5. Start `docling serve` on the host machine.
+ For more information and additional variables, see [Environment variables](/reference/configuration).
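Taken together, a minimal `.env` file for a Docker Compose deployment might look like the following. All values are placeholders; substitute your own:

```env
OPENSEARCH_PASSWORD=YourStr0ng!Passw0rd
LANGFLOW_SUPERUSER=admin
LANGFLOW_SUPERUSER_PASSWORD=your_langflow_password
LANGFLOW_SECRET_KEY=your_generated_secret_key
OPENAI_API_KEY=your_openai_api_key
```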
+
+6. Start `docling serve` on the host machine.
OpenRAG Docker installations require that `docling serve` is running on port 5001 on the host machine.
This enables [Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing.
-
+
```bash
uv run python scripts/docling_ctl.py start --port 5001
```
-
-6. Confirm `docling serve` is running.
+
+7. Confirm `docling serve` is running.
```
uv run python scripts/docling_ctl.py status
```
@@ -99,7 +105,7 @@ To install OpenRAG with Docker Compose, do the following:
PID: 27746
```
-7. Deploy OpenRAG locally with the appropriate Docker Compose file for your environment.
+8. Deploy OpenRAG locally with the appropriate Docker Compose file for your environment.
Both files deploy the same services.
* [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment with GPU support for accelerated AI processing. This Docker Compose file requires an NVIDIA GPU with [CUDA](https://docs.nvidia.com/cuda/) support.
@@ -142,7 +148,7 @@ Both files deploy the same services.
| OpenSearch | http://localhost:9200 | Datastore for [knowledge](/knowledge). |
| OpenSearch Dashboards | http://localhost:5601 | OpenSearch database administration interface. |
-8. Wait while the containers start, and then confirm all containers are running:
+9. Wait while the containers start, and then confirm all containers are running:
* Docker Compose:
@@ -158,7 +164,7 @@ Both files deploy the same services.
If all containers are running, you can access your OpenRAG services at their addresses.
-9. Access the OpenRAG frontend at `http://localhost:3000` to continue with [application onboarding](#application-onboarding).
+10. Access the OpenRAG frontend at `http://localhost:3000` to continue with [application onboarding](#application-onboarding).
diff --git a/docs/docs/get-started/install-options.mdx b/docs/docs/get-started/install-options.mdx
index cc820e6d..db72673a 100644
--- a/docs/docs/get-started/install-options.mdx
+++ b/docs/docs/get-started/install-options.mdx
@@ -1,12 +1,12 @@
---
-title: Choose an installation method
+title: Select an installation method
slug: /install-options
---
The [OpenRAG architecture](/#openrag-architecture) is lightweight and container-based with a central OpenRAG backend that orchestrates the various services and external connectors.
Depending on your use case, OpenRAG can assist with service management, or you can manage the services yourself.
-Choose the installation method that best fits your needs:
+Select the installation method that best fits your needs:
* **Use the [Terminal User Interface (TUI)](/tui) to manage services**: For guided configuration and simplified service management, install OpenRAG with TUI-managed services.
@@ -23,9 +23,9 @@ Choose the installation method that best fits your needs:
The first time you start OpenRAG, you must complete application onboarding.
This is required for all installation methods because it prepares the minimum required configuration for OpenRAG to run.
For TUI-managed services, you must also complete initial setup before you start the OpenRAG services.
-For more information, see the instructions for your chosen installation method.
+For more information, see the instructions for your preferred installation method.
Your OpenRAG configuration is stored in a `.env` file in the OpenRAG installation directory.
When using TUI-managed services, the TUI prompts you for any missing values during setup and onboarding, and any values detected in a preexisting `.env` file are automatically populated.
When using self-managed services, you must predefine these values in a `.env` file, as you would for any Docker or Podman deployment.
-For more information, see the instructions for your chosen installation method and [Environment variables](/reference/configuration).
\ No newline at end of file
+For more information, see the instructions for your preferred installation method and [Environment variables](/reference/configuration).
\ No newline at end of file
diff --git a/docs/docs/get-started/install-uv.mdx b/docs/docs/get-started/install-uv.mdx
index 0fabb6a6..5c94da6d 100644
--- a/docs/docs/get-started/install-uv.mdx
+++ b/docs/docs/get-started/install-uv.mdx
@@ -17,7 +17,7 @@ For guided configuration and simplified service management, install OpenRAG with
You can use [`uv`](https://docs.astral.sh/uv/getting-started/installation/) to install OpenRAG as a managed or unmanaged dependency in a new or existing Python project.
-For other installation methods, see [Choose an installation method](/install-options).
+For other installation methods, see [Select an installation method](/install-options).
## Prerequisites
diff --git a/docs/docs/get-started/install-uvx.mdx b/docs/docs/get-started/install-uvx.mdx
index 4115684a..05a9c563 100644
--- a/docs/docs/get-started/install-uvx.mdx
+++ b/docs/docs/get-started/install-uvx.mdx
@@ -22,7 +22,7 @@ The [automatic installer script](/install) also uses `uvx` to install OpenRAG.
:::
This installation method is best for testing OpenRAG by running it outside of a Python project.
-For other installation methods, see [Choose an installation method](/install-options).
+For other installation methods, see [Select an installation method](/install-options).
## Prerequisites
diff --git a/docs/docs/get-started/install.mdx b/docs/docs/get-started/install.mdx
index cf192516..c012394e 100644
--- a/docs/docs/get-started/install.mdx
+++ b/docs/docs/get-started/install.mdx
@@ -21,7 +21,7 @@ For guided configuration and simplified service management, install OpenRAG with
The installer script installs `uv`, Docker or Podman, Docker Compose, and OpenRAG.
This installation method is best for testing OpenRAG by running it outside of a Python project.
-For other installation methods, see [Choose an installation method](/install-options).
+For other installation methods, see [Select an installation method](/install-options).
## Prerequisites
diff --git a/docs/docs/get-started/quickstart.mdx b/docs/docs/get-started/quickstart.mdx
index b2812faf..0c34da85 100644
--- a/docs/docs/get-started/quickstart.mdx
+++ b/docs/docs/get-started/quickstart.mdx
@@ -16,7 +16,7 @@ Use this quickstart to install OpenRAG, and then try some of OpenRAG's core feat
-- Get an [OpenAI API key](https://platform.openai.com/api-keys).
+* Get an [OpenAI API key](https://platform.openai.com/api-keys).
This quickstart uses OpenAI for simplicity.
For other providers, see the other [installation methods](/install-options).
diff --git a/docs/docs/reference/configuration.mdx b/docs/docs/reference/configuration.mdx
index 7b62bd36..8221786d 100644
--- a/docs/docs/reference/configuration.mdx
+++ b/docs/docs/reference/configuration.mdx
@@ -90,36 +90,42 @@ Control how OpenRAG [processes and ingests documents](/ingestion) into your know
| `OPENRAG_DOCUMENTS_PATHS` | `./openrag-documents` | Document paths for ingestion. |
| `PICTURE_DESCRIPTIONS_ENABLED` | `false` | Enable picture descriptions. |
-### Langflow settings
+### Langflow settings {#langflow-settings}
-Configure Langflow authentication.
+Configure the OpenRAG Langflow server's authentication, contact point, and built-in flow definitions.
+
+:::info
+The `LANGFLOW_SUPERUSER_PASSWORD` variable is set in your `.env` file, and its value determines the default values for several other Langflow authentication variables.
+
+If the `LANGFLOW_SUPERUSER_PASSWORD` variable isn't set, then the Langflow server starts _without_ authentication enabled.
+
+For better security, it is recommended to set `LANGFLOW_SUPERUSER_PASSWORD` so the [Langflow server starts with authentication enabled](https://docs.langflow.org/api-keys-and-authentication#start-a-langflow-server-with-authentication-enabled).
+:::
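If you need values for `LANGFLOW_SUPERUSER_PASSWORD` or `LANGFLOW_SECRET_KEY`, a generic Python one-liner such as the following can generate a strong random token (an example utility, not an OpenRAG command):

```shell
# Print a random, URL-safe token suitable for a password or secret key.
python3 -c "from secrets import token_urlsafe; print(token_urlsafe(32))"
```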
| Variable | Default | Description |
|----------|---------|-------------|
-| `LANGFLOW_AUTO_LOGIN` | `False` | Enable auto-login for Langflow. |
-| `LANGFLOW_CHAT_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the chat [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
-| `LANGFLOW_ENABLE_SUPERUSER_CLI` | `False` | Enable superuser privileges for Langflow CLI commands. |
-| `LANGFLOW_INGEST_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the ingestion [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
-| `LANGFLOW_KEY` | Automatically generated | Explicit Langflow API key. |
-| `LANGFLOW_NEW_USER_IS_ACTIVE` | `False` | Whether new Langflow users are active by default. |
-| `LANGFLOW_PUBLIC_URL` | `http://localhost:7860` | Public URL for the Langflow instance. |
-| `LANGFLOW_SECRET_KEY` | Not set | Secret key for Langflow internal operations. |
-| `LANGFLOW_SUPERUSER` | None, must be explicitly set | Langflow admin username. Required. |
-| `LANGFLOW_SUPERUSER_PASSWORD` | None, must be explicitly set | Langflow admin password. Required. |
+| `LANGFLOW_AUTO_LOGIN` | Determined by `LANGFLOW_SUPERUSER_PASSWORD` | Whether to enable [auto-login mode](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login) for the Langflow visual editor and CLI. If `LANGFLOW_SUPERUSER_PASSWORD` isn't set, then `LANGFLOW_AUTO_LOGIN` is `True` and auto-login mode is enabled. If `LANGFLOW_SUPERUSER_PASSWORD` is set, then `LANGFLOW_AUTO_LOGIN` is `False` and auto-login mode is disabled. Langflow API calls always require authentication with a Langflow API key regardless of the auto-login setting. |
+| `LANGFLOW_ENABLE_SUPERUSER_CLI` | Determined by `LANGFLOW_SUPERUSER_PASSWORD` | Whether to enable the [Langflow CLI `langflow superuser` command](https://docs.langflow.org/api-keys-and-authentication#langflow-enable-superuser-cli). If `LANGFLOW_SUPERUSER_PASSWORD` isn't set, then `LANGFLOW_ENABLE_SUPERUSER_CLI` is `True` and superuser accounts can be created with the Langflow CLI. If `LANGFLOW_SUPERUSER_PASSWORD` is set, then `LANGFLOW_ENABLE_SUPERUSER_CLI` is `False` and the `langflow superuser` command is disabled. |
+| `LANGFLOW_NEW_USER_IS_ACTIVE` | Determined by `LANGFLOW_SUPERUSER_PASSWORD` | Whether new [Langflow user accounts are active by default](https://docs.langflow.org/api-keys-and-authentication#langflow-new-user-is-active). If `LANGFLOW_SUPERUSER_PASSWORD` isn't set, then `LANGFLOW_NEW_USER_IS_ACTIVE` is `True` and new user accounts are active by default. If `LANGFLOW_SUPERUSER_PASSWORD` is set, then `LANGFLOW_NEW_USER_IS_ACTIVE` is `False` and new user accounts are inactive by default. |
+| `LANGFLOW_PUBLIC_URL` | `http://localhost:7860` | Public URL for the Langflow instance. Forms the base URL for Langflow API calls and other interfaces with your OpenRAG Langflow instance. |
+| `LANGFLOW_KEY` | Automatically generated | A Langflow API key used to authenticate Langflow API requests, such as flow runs. Because Langflow API keys are server-specific, allow OpenRAG to generate this key initially. You can create additional Langflow API keys after deploying OpenRAG. |
+| `LANGFLOW_SECRET_KEY` | Automatically generated | Secret encryption key for Langflow internal operations. It is recommended to [generate your own Langflow secret key](https://docs.langflow.org/api-keys-and-authentication#langflow-secret-key) for this variable. If not set, Langflow generates a secret key automatically. |
+| `LANGFLOW_SUPERUSER` | `admin` | Username for the Langflow administrator user. |
+| `LANGFLOW_SUPERUSER_PASSWORD` | Not set | Langflow administrator password. If not set, the Langflow server starts _without_ authentication enabled. It is recommended to set `LANGFLOW_SUPERUSER_PASSWORD` so the [Langflow server starts with authentication enabled](https://docs.langflow.org/api-keys-and-authentication#start-a-langflow-server-with-authentication-enabled). |
| `LANGFLOW_URL` | `http://localhost:7860` | URL for the Langflow instance. |
-| `NUDGES_FLOW_ID` | Built-in flow ID | This value is automatically set to the ID of the nudges [flow](/agents). The default value is found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change this value if you explicitly don't want to use this built-in flow. |
+| `LANGFLOW_CHAT_FLOW_ID`, `LANGFLOW_INGEST_FLOW_ID`, `NUDGES_FLOW_ID` | Built-in flow IDs | These variables are set automatically to the IDs of the chat, ingestion, and nudges [flows](/agents). The default values are found in [`.env.example`](https://github.com/langflow-ai/openrag/blob/main/.env.example). Only change these values if you want to replace a built-in flow with your own custom flow. The flow JSON must be present in your version of the OpenRAG codebase. For example, if you [deploy self-managed services](/docker), you can add the flow JSON to your local clone of the OpenRAG repository before deploying OpenRAG. |
| `SYSTEM_PROMPT` | `You are a helpful AI assistant with access to a knowledge base. Answer questions based on the provided context.` | System prompt instructions for the agent driving the **Chat** flow. |
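The Langflow authentication variables above are typically set together in your `.env` file. A minimal sketch with placeholder values (the password is an example only; the comment shows one way, using Python's `secrets` module, to produce a random value for `LANGFLOW_SECRET_KEY`):

```shell
# .env -- Langflow authentication settings (placeholder values)
LANGFLOW_SUPERUSER="admin"
# Setting a password starts the Langflow server with authentication enabled
LANGFLOW_SUPERUSER_PASSWORD="change-me-to-a-strong-password"
# Generate a random secret key, for example with:
#   python3 -c "from secrets import token_urlsafe; print(token_urlsafe(32))"
LANGFLOW_SECRET_KEY="paste-generated-value-here"
```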
### OAuth provider settings
-Configure OAuth providers and external service integrations.
+Configure [OAuth providers](/ingestion#oauth-ingestion) and external service integrations.
| Variable | Default | Description |
|----------|---------|-------------|
-| `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` | - | AWS integrations. |
-| `GOOGLE_OAUTH_CLIENT_ID` / `GOOGLE_OAUTH_CLIENT_SECRET` | - | Google OAuth authentication. |
-| `MICROSOFT_GRAPH_OAUTH_CLIENT_ID` / `MICROSOFT_GRAPH_OAUTH_CLIENT_SECRET` | - | Microsoft OAuth. |
-| `WEBHOOK_BASE_URL` | - | Base URL for webhook endpoints. |
+| `AWS_ACCESS_KEY_ID`<br/>`AWS_SECRET_ACCESS_KEY` | Not set | Enable access to AWS S3 with an [AWS OAuth app](https://docs.aws.amazon.com/singlesignon/latest/userguide/manage-your-applications.html) integration. |
+| `GOOGLE_OAUTH_CLIENT_ID`<br/>`GOOGLE_OAUTH_CLIENT_SECRET` | Not set | Enable the [Google OAuth client](https://developers.google.com/identity/protocols/oauth2) integration. You can generate these values in the [Google Cloud Console](https://console.cloud.google.com/apis/credentials). |
+| `MICROSOFT_GRAPH_OAUTH_CLIENT_ID`<br/>`MICROSOFT_GRAPH_OAUTH_CLIENT_SECRET` | Not set | Enable the [Microsoft Graph OAuth client](https://learn.microsoft.com/en-us/onedrive/developer/rest-api/getting-started/graph-oauth) integration by providing [Azure application registration credentials for SharePoint and OneDrive](https://learn.microsoft.com/en-us/onedrive/developer/rest-api/getting-started/app-registration?view=odsp-graph-online). |
+| `WEBHOOK_BASE_URL` | Not set | Base URL for OAuth connector webhook endpoints. If not set, a default base URL is used. |
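As an illustration, enabling the Google OAuth integration might look like the following `.env` fragment; the client ID, secret, and URL are placeholders, and the real values come from your Google Cloud Console project:

```shell
# .env -- Google OAuth connector (placeholder credentials)
GOOGLE_OAUTH_CLIENT_ID="1234567890-example.apps.googleusercontent.com"
GOOGLE_OAUTH_CLIENT_SECRET="GOCSPX-example-secret"
# Optional: override the webhook base URL if OpenRAG runs behind a proxy
WEBHOOK_BASE_URL="https://openrag.example.com"
```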
### OpenSearch settings
@@ -127,10 +133,10 @@ Configure OpenSearch database authentication.
| Variable | Default | Description |
|----------|---------|-------------|
-| `OPENSEARCH_HOST` | `localhost` | OpenSearch host. |
-| `OPENSEARCH_PASSWORD` | - | Password for OpenSearch admin user. Required. |
-| `OPENSEARCH_PORT` | `9200` | OpenSearch port. |
-| `OPENSEARCH_USERNAME` | `admin` | OpenSearch username. |
+| `OPENSEARCH_HOST` | `localhost` | OpenSearch instance host. |
+| `OPENSEARCH_PORT` | `9200` | OpenSearch instance port. |
+| `OPENSEARCH_USERNAME` | `admin` | OpenSearch administrator username. |
+| `OPENSEARCH_PASSWORD` | Must be set at startup | Required. OpenSearch administrator password. Must adhere to the [OpenSearch password complexity requirements](https://docs.opensearch.org/latest/security/configuration/demo-configuration/#setting-up-a-custom-admin-password). You must set this directly in the `.env` file or in the TUI's [**Basic/Advanced Setup**](/install#setup). |
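One way to produce a candidate `OPENSEARCH_PASSWORD` is to generate a random value and append characters from each required class. This is a sketch, not an official OpenRAG tool; review the result against the OpenSearch complexity rules before use:

```shell
# Generate a random password, then append upper/lower/digit/special
# characters so all character classes are represented.
PASS="$(python3 -c "import secrets; print(secrets.token_urlsafe(18))")Aa1!"
# Copy the printed line into your .env file
echo "OPENSEARCH_PASSWORD=${PASS}"
```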
### System settings
@@ -141,8 +147,8 @@ Configure general system components, session management, and logging.
| `LANGFLOW_KEY_RETRIES` | `15` | Number of retries for Langflow key generation. |
| `LANGFLOW_KEY_RETRY_DELAY` | `2.0` | Delay between retries in seconds. |
| `LANGFLOW_VERSION` | `OPENRAG_VERSION` | Langflow Docker image version. By default, OpenRAG uses the `OPENRAG_VERSION` for the Langflow Docker image version. |
-| `LOG_FORMAT` | Disabled | Set to `json` to enabled JSON-formatted log output. |
-| `LOG_LEVEL` | `INFO` | Logging level (DEBUG, INFO, WARNING, ERROR). |
+| `LOG_FORMAT` | Not set | Set to `json` to enable JSON-formatted log output. If not set, the default format is used. |
+| `LOG_LEVEL` | `INFO` | Logging level. Can be one of `DEBUG`, `INFO`, `WARNING`, or `ERROR`. `DEBUG` provides the most detailed logs but can impact performance. |
| `MAX_WORKERS` | `1` | Maximum number of workers for document processing. |
| `OPENRAG_VERSION` | `latest` | The version of the OpenRAG Docker images to run. For more information, see [Upgrade OpenRAG](/upgrade) |
| `SERVICE_NAME` | `openrag` | Service name for logging. |
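For example, to get verbose structured logs while troubleshooting, you might set the logging variables above in `.env` like this (a sketch; as noted in the table, `DEBUG` can impact performance):

```shell
# .env -- verbose JSON logging for troubleshooting
LOG_LEVEL="DEBUG"
LOG_FORMAT="json"
```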