From e865d0ffd01c80b34c1c679f7d526353702b6394 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:26:18 -0400 Subject: [PATCH 01/13] clean-links-and-add-guidance --- README.md | 87 ++++++++++++++++++++++--------------------------------- 1 file changed, 34 insertions(+), 53 deletions(-) diff --git a/README.md b/README.md index a0178f28..d15328d3 100644 --- a/README.md +++ b/README.md @@ -4,15 +4,14 @@
- 🚀 Quick Start   |   - 💻 TUI Interface   |   - 🐳 Docker Deployment   |   - ⚙️ Development   |   - 🔧 Troubleshooting + Quick Start   |   + TUI Interface   |   + Docker Deployment   |   + Development   |   + Troubleshooting
- OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/phact/openrag) @@ -27,12 +26,7 @@ OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables - - - - - -## 🚀 Quick Start +## Quick Start ### Prerequisites @@ -62,7 +56,8 @@ LANGFLOW_CHAT_FLOW_ID=your_chat_flow_id LANGFLOW_INGEST_FLOW_ID=your_ingest_flow_id NUDGES_FLOW_ID=your_nudges_flow_id ``` -See extended configuration, including ingestion and optional variables: [docs/reference/configuration.md](docs/docs/reference/configuration.md) +See extended configuration, including ingestion and optional variables: [docs/reference/configuration.mdx](docs/docs/reference/configuration.mdx) + ### 3. Start OpenRAG ```bash @@ -80,13 +75,17 @@ Access the services: - **OpenSearch**: http://localhost:9200 - **OpenSearch Dashboards**: http://localhost:5601 -## 🖥️ TUI Interface +With OpenRAG started, ingest and retrieve documents with the [OpenRAG Quickstart](/docs/get-started/quickstart.mdx). + +## TUI interface OpenRAG includes a powerful Terminal User Interface (TUI) for easy setup, configuration, and monitoring. The TUI provides a user-friendly way to manage your OpenRAG installation without complex command-line operations. 
![OpenRAG TUI Interface](assets/OpenRAG_TUI_2025-09-10T13_04_11_757637.svg) -### Launching the TUI +### Launch OpenRAG with the TUI + +From the repository root, run: ```bash # Install dependencies first @@ -96,66 +95,48 @@ uv sync uv run openrag ``` -### TUI Features +For the full TUI guide, see [docs/get-started/tui.mdx](docs/docs/get-started/tui.mdx) -See the full TUI guide for features, navigation, and benefits: [docs/get-started/tui.mdx](docs/docs/get-started/tui.mdx) +## Docker Deployment +The repository includes two Docker Compose files. +They deploy the same applications and containers, but to different environments. +- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment with GPU support for accelerated AI processing. +- [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker compose file for environments where GPU drivers aren't available. -## 🐳 Docker Deployment - -### Standard Deployment - +1. Clone the OpenRAG repository. +```bash +git clone https://github.com/langflow-ai/openrag.git +cd openrag +``` + +2. Build and start all services. + +For the GPU-accelerated deployment, run: ```bash -# Build and start all services docker compose build docker compose up -d ``` -### CPU-Only Deployment - -For environments without GPU support: - +For environments without GPU support, run: ```bash docker compose -f docker-compose-cpu.yml up -d ``` -More deployment commands and tips: [docs/get-started/docker.mdx](docs/docs/get-started/docker.mdx) +For more information, see [docs/get-started/docker.mdx](docs/docs/get-started/docker.mdx) -## 🔧 Troubleshooting +## Troubleshooting -### Podman on macOS +For common issues and fixes, see [docs/support/troubleshoot.mdx](docs/docs/support/troubleshoot.mdx). 
-If using Podman on macOS, you may need to increase VM memory: - -```bash -podman machine stop -podman machine rm -podman machine init --memory 8192 # 8 GB example -podman machine start -``` - -### Common Issues - -See common issues and fixes: [docs/support/troubleshoot.mdx](docs/docs/reference/troubleshoot.mdx) - - - -## 🛠️ Development +## Development For developers wanting to contribute to OpenRAG or set up a development environment, please see our comprehensive development guide: **[📚 See CONTRIBUTING.md for detailed development instructions](CONTRIBUTING.md)** -The contributing guide includes: -- Complete development environment setup -- Local development workflows -- Testing and debugging procedures -- Code style guidelines -- Architecture overview -- Pull request guidelines - ### Quick Development Commands ```bash From 5a567a0b95e1d160951c993e1b9e052f0acdd1fa Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:29:00 -0400 Subject: [PATCH 02/13] add-release-badge --- README.md | 27 ++++++++++++++------------- 1 file changed, 14 insertions(+), 13 deletions(-) diff --git a/README.md b/README.md index d15328d3..4487e522 100644 --- a/README.md +++ b/README.md @@ -2,20 +2,9 @@ # OpenRAG - -
- Quick Start   |   - TUI Interface   |   - Docker Deployment   |   - Development   |   - Troubleshooting -
- - -OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/phact/openrag) - -
+ Release +    Langflow    OpenSearch @@ -23,9 +12,21 @@ OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables Starlette    Next.js +    + Ask DeepWiki
+OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. + +
+ Quick Start   |   + TUI Interface   |   + Docker Deployment   |   + Development   |   + Troubleshooting +
+ ## Quick Start ### Prerequisites From 1769095581131a83ac777311aefa092078860f1a Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:33:14 -0400 Subject: [PATCH 03/13] include-prerelease --- README.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index 4487e522..e7b1308f 100644 --- a/README.md +++ b/README.md @@ -3,8 +3,6 @@ # OpenRAG
- Release -    Langflow    OpenSearch @@ -14,7 +12,8 @@ Next.js    Ask DeepWiki - + Release +   
OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. From ebde493571dabb88386942711ae233ef9a15ebdb Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:33:50 -0400 Subject: [PATCH 04/13] remove-release-badge --- README.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/README.md b/README.md index e7b1308f..e4f90d3d 100644 --- a/README.md +++ b/README.md @@ -12,8 +12,6 @@ Next.js    Ask DeepWiki - Release -    OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. From 52a48949fb7ecb3139a442cd66139f1253d8c105 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:44:55 -0400 Subject: [PATCH 05/13] changes-for-clarity --- README.md | 30 ++++++++++++++++-------------- 1 file changed, 16 insertions(+), 14 deletions(-) diff --git a/README.md b/README.md index e4f90d3d..3d15b53c 100644 --- a/README.md +++ b/README.md @@ -17,21 +17,24 @@ OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. 
Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration.
- Quick Start   |   + Quickstart   |   TUI Interface   |   Docker Deployment   |   Development   |   Troubleshooting
-## Quick Start +## Quickstart + ### Prerequisites - Docker or Podman with Compose installed - Make (for development commands) -### 1. Environment Setup +### Install and start OpenRAG + +1. Set up development environment. ```bash # Clone and setup environment @@ -40,9 +43,7 @@ cd openrag make setup # Creates .env and installs dependencies ``` -### 2. Configure Environment - -Edit `.env` with your API keys and credentials: +2. Configure the `.env` file with your API keys and credentials. ```bash # Required @@ -54,9 +55,10 @@ LANGFLOW_CHAT_FLOW_ID=your_chat_flow_id LANGFLOW_INGEST_FLOW_ID=your_ingest_flow_id NUDGES_FLOW_ID=your_nudges_flow_id ``` -See extended configuration, including ingestion and optional variables: [docs/reference/configuration.mdx](docs/docs/reference/configuration.mdx) -### 3. Start OpenRAG +For extended configuration, including ingestion and optional variables, see [docs/reference/configuration.mdx](docs/docs/reference/configuration.mdx) + +3. Start OpenRAG. ```bash # Full stack with GPU support @@ -73,7 +75,7 @@ Access the services: - **OpenSearch**: http://localhost:9200 - **OpenSearch Dashboards**: http://localhost:5601 -With OpenRAG started, ingest and retrieve documents with the [OpenRAG Quickstart](/docs/get-started/quickstart.mdx). +With OpenRAG started, ingest and retrieve documents with the [OpenRAG Quickstart](docs/docs/get-started/quickstart.mdx). ## TUI interface @@ -93,14 +95,14 @@ uv sync uv run openrag ``` -For the full TUI guide, see [docs/get-started/tui.mdx](docs/docs/get-started/tui.mdx) +For the full TUI guide, see [TUI](docs/docs/get-started/tui.mdx). ## Docker Deployment -The repository includes two Docker Compose files. +The repository includes two Docker Compose `.yml` files. They deploy the same applications and containers, but to different environments. -- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment with GPU support for accelerated AI processing. 
+- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment for environments with GPU support. GPU support requires an NVIDIA GPU with CUDA support and compatible NVIDIA drivers installed on the OpenRAG host machine. - [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker compose file for environments where GPU drivers aren't available. @@ -123,11 +125,11 @@ For environments without GPU support, run: docker compose -f docker-compose-cpu.yml up -d ``` -For more information, see [docs/get-started/docker.mdx](docs/docs/get-started/docker.mdx) +For more information, see [Deploy with Docker](docs/docs/get-started/docker.mdx). ## Troubleshooting -For common issues and fixes, see [docs/support/troubleshoot.mdx](docs/docs/support/troubleshoot.mdx). +For common issues and fixes, see [Troubleshoot](docs/docs/support/troubleshoot.mdx). ## Development From 4720dc6d4affb7ebc287f89401eb694f2263ee30 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 11:59:36 -0400 Subject: [PATCH 06/13] tui-quickstart --- CONTRIBUTING.md | 42 ++++++++++++++++++++++----- README.md | 77 ++++++------------------------------------------- 2 files changed, 44 insertions(+), 75 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 19b01709..6b8cd832 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -11,20 +11,48 @@ Thank you for your interest in contributing to OpenRAG! This guide will help you - Python 3.13+ with uv package manager - Node.js 18+ and npm -### Environment Setup +### Set up OpenRAG for development + +1. Set up your development environment. 
```bash -# Clone the repository -git clone +# Clone and setup environment +git clone https://github.com/langflow-ai/openrag.git cd openrag - -# Setup development environment make setup # Creates .env and installs dependencies ``` -### Configuration +2. Configure the `.env` file with your API keys and credentials. -Edit `.env` with your API keys and credentials. See the main README for required environment variables. +```bash +# Required +OPENAI_API_KEY=your_openai_api_key +OPENSEARCH_PASSWORD=your_secure_password +LANGFLOW_SUPERUSER=admin +LANGFLOW_SUPERUSER_PASSWORD=your_secure_password +LANGFLOW_CHAT_FLOW_ID=your_chat_flow_id +LANGFLOW_INGEST_FLOW_ID=your_ingest_flow_id +NUDGES_FLOW_ID=your_nudges_flow_id +``` + +For extended configuration, including ingestion and optional variables, see [docs/reference/configuration.mdx](docs/docs/reference/configuration.mdx). + +3. Start OpenRAG. + +```bash +# Full stack with GPU support +make dev + +# Or CPU only +make dev-cpu +``` + +Access the services: +- **Frontend**: http://localhost:3000 +- **Backend API**: http://localhost:8000 +- **Langflow**: http://localhost:7860 +- **OpenSearch**: http://localhost:9200 +- **OpenSearch Dashboards**: http://localhost:5601 ## 🔧 Development Commands diff --git a/README.md b/README.md index 3d15b53c..4c06e4a9 100644 --- a/README.md +++ b/README.md @@ -14,7 +14,8 @@ Ask DeepWiki -OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. +OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables intelligent document search and AI-powered conversations. 
Users can upload, process, and query documents through a chat interface backed by large language models and semantic search capabilities. The system utilizes Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. Built with Starlette, Next.js, OpenSearch, and Langflow integration. +
Quickstart   |   @@ -26,67 +27,17 @@ OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables ## Quickstart +To get started quickly, use the OpenRAG Terminal User Interface (TUI) to manage your OpenRAG installation without complex command-line operations. -### Prerequisites - -- Docker or Podman with Compose installed -- Make (for development commands) - -### Install and start OpenRAG - -1. Set up development environment. +To launch OpenRAG with the TUI, do the following: +1. Clone the OpenRAG repository. ```bash -# Clone and setup environment git clone https://github.com/langflow-ai/openrag.git cd openrag -make setup # Creates .env and installs dependencies ``` -2. Configure the `.env` file with your API keys and credentials. - -```bash -# Required -OPENAI_API_KEY=your_openai_api_key -OPENSEARCH_PASSWORD=your_secure_password -LANGFLOW_SUPERUSER=admin -LANGFLOW_SUPERUSER_PASSWORD=your_secure_password -LANGFLOW_CHAT_FLOW_ID=your_chat_flow_id -LANGFLOW_INGEST_FLOW_ID=your_ingest_flow_id -NUDGES_FLOW_ID=your_nudges_flow_id -``` - -For extended configuration, including ingestion and optional variables, see [docs/reference/configuration.mdx](docs/docs/reference/configuration.mdx) - -3. Start OpenRAG. - -```bash -# Full stack with GPU support -make dev - -# Or CPU only -make dev-cpu -``` - -Access the services: -- **Frontend**: http://localhost:3000 -- **Backend API**: http://localhost:8000 -- **Langflow**: http://localhost:7860 -- **OpenSearch**: http://localhost:9200 -- **OpenSearch Dashboards**: http://localhost:5601 - -With OpenRAG started, ingest and retrieve documents with the [OpenRAG Quickstart](docs/docs/get-started/quickstart.mdx). - -## TUI interface - -OpenRAG includes a powerful Terminal User Interface (TUI) for easy setup, configuration, and monitoring. The TUI provides a user-friendly way to manage your OpenRAG installation without complex command-line operations. 
- -![OpenRAG TUI Interface](assets/OpenRAG_TUI_2025-09-10T13_04_11_757637.svg) - -### Launch OpenRAG with the TUI - -From the repository root, run: - +2. To start the TUI, from the repository root, run: ```bash # Install dependencies first uv sync @@ -95,6 +46,8 @@ uv sync uv run openrag ``` +The TUI opens and guides you through OpenRAG setup. + For the full TUI guide, see [TUI](docs/docs/get-started/tui.mdx). ## Docker Deployment @@ -133,16 +86,4 @@ For common issues and fixes, see [Troubleshoot](docs/docs/support/troubleshoot.m ## Development -For developers wanting to contribute to OpenRAG or set up a development environment, please see our comprehensive development guide: - -**[📚 See CONTRIBUTING.md for detailed development instructions](CONTRIBUTING.md)** - -### Quick Development Commands - -```bash -make help # See all available commands -make setup # Initial development setup -make infra # Start infrastructure services -make backend # Run backend locally -make frontend # Run frontend locally -``` \ No newline at end of file +For developers wanting to contribute to OpenRAG or set up a development environment, see [CONTRIBUTING.md](CONTRIBUTING.md). \ No newline at end of file From 19c86c8b72e4ef0701e1f6c8b9114dd65148bf30 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 12:02:46 -0400 Subject: [PATCH 07/13] style --- README.md | 20 ++++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/README.md b/README.md index 4c06e4a9..b4beef2a 100644 --- a/README.md +++ b/README.md @@ -52,7 +52,7 @@ For the full TUI guide, see [TUI](docs/docs/get-started/tui.mdx). ## Docker Deployment -The repository includes two Docker Compose `.yml` files. +If you prefer to use Docker to run OpenRAG, the repository includes two Docker Compose `.yml` files. They deploy the same applications and containers, but to different environments. 
- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment for environments with GPU support. GPU support requires an NVIDIA GPU with CUDA support and compatible NVIDIA drivers installed on the OpenRAG host machine. @@ -67,16 +67,16 @@ cd openrag 2. Build and start all services. -For the GPU-accelerated deployment, run: -```bash -docker compose build -docker compose up -d -``` + For the GPU-accelerated deployment, run: + ```bash + docker compose build + docker compose up -d + ``` -For environments without GPU support, run: -```bash -docker compose -f docker-compose-cpu.yml up -d -``` + For environments without GPU support, run: + ```bash + docker compose -f docker-compose-cpu.yml up -d + ``` For more information, see [Deploy with Docker](docs/docs/get-started/docker.mdx). From 1b45813dfdbb16bf2dab0fb71a17396c77b32848 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 12:03:56 -0400 Subject: [PATCH 08/13] restating-headline --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index b4beef2a..b4838cd8 100644 --- a/README.md +++ b/README.md @@ -27,7 +27,7 @@ OpenRAG is a comprehensive Retrieval-Augmented Generation platform that enables ## Quickstart -To get started quickly, use the OpenRAG Terminal User Interface (TUI) to manage your OpenRAG installation without complex command-line operations. +Use the OpenRAG Terminal User Interface (TUI) to manage your OpenRAG installation without complex command-line operations. 
To launch OpenRAG with the TUI, do the following: From e88601c05aa72f58d223fbea4b7dbf21788eff55 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 12:05:54 -0400 Subject: [PATCH 09/13] spacing --- README.md | 32 ++++++++++++++++---------------- 1 file changed, 16 insertions(+), 16 deletions(-) diff --git a/README.md b/README.md index b4838cd8..bfccdb6a 100644 --- a/README.md +++ b/README.md @@ -32,21 +32,21 @@ Use the OpenRAG Terminal User Interface (TUI) to manage your OpenRAG installatio To launch OpenRAG with the TUI, do the following: 1. Clone the OpenRAG repository. -```bash -git clone https://github.com/langflow-ai/openrag.git -cd openrag -``` + ```bash + git clone https://github.com/langflow-ai/openrag.git + cd openrag + ``` 2. To start the TUI, from the repository root, run: -```bash -# Install dependencies first -uv sync + ```bash + # Install dependencies first + uv sync + + # Launch the TUI + uv run openrag + ``` -# Launch the TUI -uv run openrag -``` - -The TUI opens and guides you through OpenRAG setup. + The TUI opens and guides you through OpenRAG setup. For the full TUI guide, see [TUI](docs/docs/get-started/tui.mdx). @@ -60,10 +60,10 @@ They deploy the same applications and containers, but to different environments. - [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker compose file for environments where GPU drivers aren't available. 1. Clone the OpenRAG repository. -```bash -git clone https://github.com/langflow-ai/openrag.git -cd openrag -``` + ```bash + git clone https://github.com/langflow-ai/openrag.git + cd openrag + ``` 2. Build and start all services. 
From c5b88b0201bf65cd93072720627230353ef4a28 Mon Sep 17 00:00:00 2001
From: Mendon Kissling <59585235+mendonk@users.noreply.github.com>
Date: Wed, 8 Oct 2025 13:30:17 -0400
Subject: [PATCH 10/13] docling-requirement

---
 README.md | 48 +++++++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 47 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index bfccdb6a..7d643727 100644
--- a/README.md
+++ b/README.md
@@ -59,13 +59,42 @@ They deploy the same applications and containers, but to different environments.
 
 - [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker compose file for environments where GPU drivers aren't available.
 
+Both Docker deployments depend on `docling serve` to be running on port `5001` on the host machine. This is required to take advantage of[Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing. Installing OpenRAG with the TUI starts `docling serve` automatically, but for a Docker deployment you must manually start the `docling serve` process.
+
+Alternatively, set `DISABLE_INGEST_WITH_LANGFLOW=true` in your `.env` to use OpenRAG's built-in pipeline, which uses docling directly without requiring `docling serve`.
+
+To deploy OpenRAG with Docker:
+
 1. Clone the OpenRAG repository.
    ```bash
    git clone https://github.com/langflow-ai/openrag.git
    cd openrag
    ```
 
-2. Build and start all services.
+2. Install dependencies.
+   ```bash
+   uv sync
+   ```
+
+3. Start `docling serve` on the host machine.
+   ```bash
+   uv run python scripts/docling_ctl.py start --port 5001
+   ```
+
+4. Confirm `docling serve` is running.
+   ```bash
+   uv run python scripts/docling_ctl.py status
+   ```
+
+   Successful result:
+   ```
+   Status: running
+   Endpoint: http://127.0.0.1:5001
+   Docs: http://127.0.0.1:5001/docs
+   PID: 27746
+   ```
+
+5. Build and start all services.
For the GPU-accelerated deployment, run: ```bash @@ -78,6 +107,23 @@ They deploy the same applications and containers, but to different environments. docker compose -f docker-compose-cpu.yml up -d ``` + The OpenRAG Docker Compose file starts five containers: + | Container Name | Default Address | Purpose | + |---|---|---| + | OpenRAG Backend | http://localhost:8000 | FastAPI server and core functionality. | + | OpenRAG Frontend | http://localhost:3000 | React web interface for users. | + | Langflow | http://localhost:7860 | AI workflow engine and flow management. | + | OpenSearch | http://localhost:9200 | Vector database for document storage. | + | OpenSearch Dashboards | http://localhost:5601 | Database administration interface. | + + You can now access the OpenRAG application at `http://localhost:3000`. + + To stop `docling serve`, run: + + ```bash + uv run python scripts/docling_ctl.py stop + ``` + For more information, see [Deploy with Docker](docs/docs/get-started/docker.mdx). ## Troubleshooting From a2608d0281dc503fdae93ec6b27f622836639dbb Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 13:34:01 -0400 Subject: [PATCH 11/13] style --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 7d643727..a237b7ed 100644 --- a/README.md +++ b/README.md @@ -59,7 +59,7 @@ They deploy the same applications and containers, but to different environments. - [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker compose file for environments where GPU drivers aren't available. -Both Docker deployments depend on `docling serve` to be running on port `5001` on the host machine. This is required to take advantage of[Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing. 
Installing OpenRAG with the TUI starts `docling serve` automatically, but for a Docker deployment you must manually start the `docling serve` process. +Both Docker deployments depend on `docling serve` to be running on port `5001` on the host machine. This enables [Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing. Installing OpenRAG with the TUI starts `docling serve` automatically, but for a Docker deployment you must manually start the `docling serve` process. Alternatively, set `DISABLE_INGEST_WITH_LANGFLOW=true` in your `.env` to use OpenRAG's built-in pipeline, which uses docling directly without requiring `docling serve`. @@ -118,7 +118,7 @@ To deploy OpenRAG with Docker: You can now access the OpenRAG application at `http://localhost:3000`. - To stop `docling serve`, run: +To stop `docling serve`, run: ```bash uv run python scripts/docling_ctl.py stop From 6b8ff56e6fbe7726822b0512e086458855c44282 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 13:37:00 -0400 Subject: [PATCH 12/13] shorten-sentence --- README.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/README.md b/README.md index a237b7ed..77ce3191 100644 --- a/README.md +++ b/README.md @@ -116,10 +116,10 @@ To deploy OpenRAG with Docker: | OpenSearch | http://localhost:9200 | Vector database for document storage. | | OpenSearch Dashboards | http://localhost:5601 | Database administration interface. | - You can now access the OpenRAG application at `http://localhost:3000`. - -To stop `docling serve`, run: +6. Access the OpenRAG application at `http://localhost:3000` and continue with the [Quickstart](docs/docs/get-started/quickstart.mdx). 
+ To stop `docling serve`, run: + ```bash uv run python scripts/docling_ctl.py stop ``` From bd21001b6457ee9576a9d9eb4a6143d2e6c83590 Mon Sep 17 00:00:00 2001 From: Mendon Kissling <59585235+mendonk@users.noreply.github.com> Date: Wed, 8 Oct 2025 13:45:45 -0400 Subject: [PATCH 13/13] remove-env-var-option --- README.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/README.md b/README.md index 77ce3191..a7abbbe6 100644 --- a/README.md +++ b/README.md @@ -61,8 +61,6 @@ They deploy the same applications and containers, but to different environments. Both Docker deployments depend on `docling serve` to be running on port `5001` on the host machine. This enables [Mac MLX](https://opensource.apple.com/projects/mlx/) support for document processing. Installing OpenRAG with the TUI starts `docling serve` automatically, but for a Docker deployment you must manually start the `docling serve` process. -Alternatively, set `DISABLE_INGEST_WITH_LANGFLOW=true` in your `.env` to use OpenRAG's built-in pipeline, which uses docling directly without requiring `docling serve`. - To deploy OpenRAG with Docker: 1. Clone the OpenRAG repository.
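
The Docker deployment these patches arrive at can be smoke-tested once `docling serve` and the five containers are up. The sketch below assumes the default ports from the container table in PATCH 10; the `check_service` helper is illustrative and not part of the repository:

```shell
#!/bin/sh
# Smoke-test a local OpenRAG stack (sketch; ports assumed from the compose files).
# check_service prints "<name> ok" if the URL answers, "<name> unreachable" otherwise.
check_service() {
  name="$1"; url="$2"
  if curl -fsS --max-time 5 "$url" > /dev/null 2>&1; then
    echo "$name ok"
  else
    echo "$name unreachable"
  fi
}

check_service "docling-serve"         "http://127.0.0.1:5001/docs"
check_service "backend"               "http://localhost:8000"
check_service "frontend"              "http://localhost:3000"
check_service "langflow"              "http://localhost:7860"
check_service "opensearch"            "http://localhost:9200"
check_service "opensearch-dashboards" "http://localhost:5601"
```

If any line prints `unreachable`, inspect that service with `docker compose logs` (or `docker compose -f docker-compose-cpu.yml logs` for the CPU-only deployment).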