OpenRAG

OpenRAG is a comprehensive Retrieval-Augmented Generation (RAG) platform that enables intelligent document search and AI-powered conversations. Users can upload, process, and query documents through a chat interface backed by large language models and semantic search. The system uses Langflow for document ingestion, retrieval workflows, and intelligent nudges, providing a seamless RAG experience. OpenRAG is built with Starlette, Next.js, OpenSearch, and Langflow.

Quickstart   |   TUI Interface   |   Docker Deployment   |   Development   |   Troubleshooting

Quickstart

Use the OpenRAG Terminal User Interface (TUI) to manage your OpenRAG installation without complex command-line operations.

To launch OpenRAG with the TUI, do the following:

  1. Clone the OpenRAG repository.

    git clone https://github.com/langflow-ai/openrag.git
    cd openrag
    
  2. From the repository root, install dependencies and start the TUI (if uv isn't installed yet, see the note after these steps):

    # Install dependencies first
    uv sync
    
    # Launch the TUI
    uv run openrag
    

    The TUI opens and guides you through OpenRAG setup.
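
These steps assume the uv package manager is already installed. If it isn't, one common way to install it is Astral's standalone installer, shown below as an example; the uv project also documents Homebrew and pipx installs.

    # Install uv (see https://docs.astral.sh/uv/ for other installation methods)
    curl -LsSf https://astral.sh/uv/install.sh | sh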

For the full TUI guide, see TUI.

Docker Deployment

If you prefer to run OpenRAG with Docker, the repository includes two Docker Compose files. They deploy the same applications and containers, but target different environments.

  • docker-compose.yml deploys OpenRAG with GPU acceleration. It requires an NVIDIA GPU with CUDA support and compatible NVIDIA drivers installed on the OpenRAG host machine. If you aren't sure whether the host qualifies, see the check after this list.

  • docker-compose-cpu.yml is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker Compose file in environments where GPU drivers aren't available.
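
A quick way to test whether Docker can use an NVIDIA GPU is to run nvidia-smi on the host and then inside a throwaway container. This is only a sketch: it assumes the NVIDIA Container Toolkit is installed, and the CUDA image tag is just an example. If either command fails, use docker-compose-cpu.yml.

    # Check the host drivers, then check that Docker can pass the GPU through.
    nvidia-smi
    docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi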

Both Docker deployments require docling serve to be running on port 5001 on the host machine; this enables Mac MLX support for document processing. Installing OpenRAG with the TUI starts docling serve automatically, but for a Docker deployment you must start docling serve manually.
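
Because the TUI starts docling serve automatically, it may already be running on your machine. You can check before starting another instance with the same status command used in step 4 below; if it reports that docling serve is running, you may be able to skip step 3.

    # Reports whether docling serve is already running and on which endpoint.
    uv run python scripts/docling_ctl.py status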

To deploy OpenRAG with Docker:

  1. Clone the OpenRAG repository.

    git clone https://github.com/langflow-ai/openrag.git
    cd openrag
    
  2. Install dependencies.

    uv sync
    
  3. Start docling serve on the host machine.

    uv run python scripts/docling_ctl.py start --port 5001
    
  4. Confirm docling serve is running.

    uv run python scripts/docling_ctl.py status
    

    Successful result:

    Status: running
    Endpoint: http://127.0.0.1:5001
    Docs: http://127.0.0.1:5001/docs
    PID: 27746
    
  5. Build and start all services.

    For the GPU-accelerated deployment, run:

    docker compose build
    docker compose up -d
    

    For environments without GPU support, run:

    docker compose -f docker-compose-cpu.yml up -d
    

    Both Docker Compose files start the same five containers:

    Container Name          Default Address          Purpose
    OpenRAG Backend         http://localhost:8000    FastAPI server and core functionality.
    OpenRAG Frontend        http://localhost:3000    React web interface for users.
    Langflow                http://localhost:7860    AI workflow engine and flow management.
    OpenSearch              http://localhost:9200    Vector database for document storage.
    OpenSearch Dashboards   http://localhost:5601    Database administration interface.
  6. Access the OpenRAG application at http://localhost:3000 and continue with the Quickstart. To confirm that all of the services came up, see the checks at the end of this section.

    To stop docling serve, run:

    uv run python scripts/docling_ctl.py stop
    

For more information, see Deploy with Docker.
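
Once the stack is up, you can confirm that the containers are running and that the published ports answer. This is a sketch against the default addresses in the table above: docker compose ps and curl only show that each port responds (exact status codes vary by service), and the OpenSearch line assumes the security plugin's default HTTPS and basic-auth setup, with a placeholder password.

    # List the services (add -f docker-compose-cpu.yml if you used the CPU file).
    docker compose ps

    # Probe the published ports from the table above.
    curl -I http://localhost:3000    # OpenRAG Frontend
    curl -I http://localhost:8000    # OpenRAG Backend
    curl -I http://localhost:7860    # Langflow

    # OpenSearch typically requires HTTPS and credentials when the security
    # plugin is enabled; <admin-password> is a placeholder.
    curl -k -u admin:<admin-password> https://localhost:9200

When you're finished, docker compose down (with the same -f flag, if any) stops and removes the containers.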

Troubleshooting

For common issues and fixes, see Troubleshoot.

Development

To contribute to OpenRAG or set up a development environment, see CONTRIBUTING.md.