Merge branch 'main' into sdks

commit 3098533a44
Author: Sebastián Estévez
Date: 2025-12-18 11:51:37 -05:00 (committed by GitHub)
5 changed files with 1422 additions and 1658 deletions


@@ -89,7 +89,7 @@ For example: `What documents are available to you?`
 The agent responds with a summary of OpenRAG's default documents.
-3. To verify the agent's response, click <Icon name="Library" aria-hidden="true"/> **Knowledge** to view the documents stored in the OpenRAG OpenSearch vector database.
+3. To verify the agent's response, click <Icon name="Library" aria-hidden="true"/> **Knowledge** to view the documents stored in the OpenRAG OpenSearch database.
 You can click a document to view the chunks of the document as they are stored in the database.
 4. Click **Add Knowledge** to add your own documents to your OpenRAG knowledge base.
@@ -106,7 +106,7 @@ You can click a document to view the chunks of the document as they are stored i
 * Click <Icon name="Gear" aria-hidden="true"/> **Function Call: search_documents (tool_call)** to view the log of tool calls made by the agent. This is helpful for troubleshooting because it shows you how the agent used particular tools.
-* Click <Icon name="Library" aria-hidden="true"/> **Knowledge** to confirm that the documents are present in the OpenRAG OpenSearch vector database, and then click each document to see how the document was chunked.
+* Click <Icon name="Library" aria-hidden="true"/> **Knowledge** to confirm that the documents are present in the OpenRAG OpenSearch database, and then click each document to see how the document was chunked.
 If a document was chunked improperly, you might need to tweak the ingestion or modify and reupload the document.
 * Click <Icon name="Settings2" aria-hidden="true"/> **Settings** to modify the knowledge ingestion settings.


@@ -4,11 +4,11 @@ slug: /
 hide_table_of_contents: true
 ---
-OpenRAG is an open-source package for building agentic RAG systems that integrates with a wide range of orchestration tools, vector databases, and LLM providers.
+OpenRAG is an open-source package for building agentic RAG systems that integrates with a wide range of orchestration tools, databases, and LLM providers.
 OpenRAG connects and amplifies three popular, proven open-source projects into one powerful platform:
-* [Langflow](https://docs.langflow.org): Langflow is a versatile tool for building and deploying AI agents and MCP servers. It supports all major LLMs, vector databases, and a growing library of AI tools.
+* [Langflow](https://docs.langflow.org): Langflow is a versatile tool for building and deploying AI agents and MCP servers. It supports all major LLMs, popular vector databases, and a growing library of AI tools.
 OpenRAG uses several built-in flows, and it provides full access to all Langflow features through the embedded Langflow visual editor.
@@ -17,7 +17,7 @@ OpenRAG connects and amplifies three popular, proven open-source projects into o
 * [OpenSearch](https://docs.opensearch.org/latest/): OpenSearch is a community-driven, Apache 2.0-licensed open source search and analytics suite that makes it easy to ingest, search, visualize, and analyze data.
 It provides powerful hybrid search capabilities with enterprise-grade security and multi-tenancy support.
-OpenRAG uses OpenSearch as the underlying vector database for storing and retrieving your documents and associated vector data (embeddings). You can ingest documents from a variety of sources, including your local filesystem and OAuth authenticated connectors to popular cloud storage services.
+OpenRAG uses OpenSearch as the underlying database for storing and retrieving your documents and associated vector data (embeddings). You can ingest documents from a variety of sources, including your local filesystem and OAuth authenticated connectors to popular cloud storage services.
 * [Docling](https://docling-project.github.io/docling/): Docling simplifies document processing, supports many file formats and advanced PDF parsing, and provides seamless integrations with the generative AI ecosystem.
@@ -59,7 +59,7 @@ flowchart TD
 * **OpenRAG backend**: The central orchestration service that coordinates all other components.
-* **Langflow**: This container runs a Langflow instance. It provides the embedded Langflow visual editor for editing and creating flows, and it connects to the **OpenSearch** container for vector storage and retrieval.
+* **Langflow**: This container runs a Langflow instance. It provides the embedded Langflow visual editor for editing and creating flows, and it connects to the **OpenSearch** container for document storage and retrieval.
 * **Docling Serve**: This is a local document processing service managed by the **OpenRAG backend**.

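The docs above describe OpenSearch as the store for document chunks and their embeddings. As a hedged illustration only (the index name, field names, and embedding size below are assumptions, not OpenRAG's actual schema), a k-NN index mapping and query body for OpenSearch's vector search look roughly like this:

```python
# Hypothetical sketch of storing and querying document embeddings with
# OpenSearch's k-NN plugin. All names and dimensions are illustrative.

EMBEDDING_DIM = 384  # assumed embedding size

# Index mapping: a text field for the chunk plus a knn_vector field.
index_mapping = {
    "settings": {"index": {"knn": True}},
    "mappings": {
        "properties": {
            "chunk_text": {"type": "text"},
            "embedding": {"type": "knn_vector", "dimension": EMBEDDING_DIM},
        }
    },
}

def knn_query(query_vector, k=5):
    """Build an OpenSearch k-NN query body for the given embedding."""
    return {
        "size": k,
        "query": {"knn": {"embedding": {"vector": query_vector, "k": k}}},
    }

body = knn_query([0.0] * EMBEDDING_DIM, k=3)
```

In a real deployment these bodies would be sent through an OpenSearch client; the sketch only shows the request shapes.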

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "openrag"
-version = "0.1.53"
+version = "0.1.54"
 description = "Add your description here"
 readme = "README.md"
 requires-python = ">=3.13"


@@ -96,7 +96,10 @@ class WelcomeScreen(Screen):
         try:
             # Use detected runtime command to check services
             import subprocess
-            compose_cmd = self.container_manager.runtime_info.compose_command + ["ps", "--format", "json"]
+            compose_cmd = self.container_manager.runtime_info.compose_command + [
+                "-f", str(self.container_manager.compose_file),
+                "ps", "--format", "json"
+            ]
             result = subprocess.run(
                 compose_cmd,
                 capture_output=True,
@@ -128,20 +131,38 @@ class WelcomeScreen(Screen):
                 # Check if services are running (exclude starting/created states)
                 # State can be lowercase or mixed case, so normalize it
-                running_services = []
-                starting_services = []
+                # Only consider expected services (filter out stale/leftover containers)
+                expected = set(self.container_manager.expected_services)
+                name_map = self.container_manager.container_name_map
+                running_services = set()
+                starting_services = set()
                 for s in services:
                     if not isinstance(s, dict):
                         continue
+                    # Get service name - try compose label first (most reliable for Podman)
+                    labels = s.get('Labels', {}) or {}
+                    service_name = labels.get('com.docker.compose.service', '')
+                    if not service_name:
+                        # Fall back to container name mapping
+                        container_name = s.get('Name') or s.get('Service', '')
+                        if not container_name:
+                            names = s.get('Names', [])
+                            if names and isinstance(names, list):
+                                container_name = names[0]
+                        # Map container name to service name using container_name_map
+                        service_name = name_map.get(container_name, container_name)
+                    # Skip if not an expected service
+                    if service_name not in expected:
+                        continue
                     state = str(s.get('State', '')).lower()
                     if state == 'running':
-                        running_services.append(s)
+                        running_services.add(service_name)
                     elif 'starting' in state or 'created' in state:
-                        starting_services.append(s)
-                # Only consider services running if we have running services AND no starting services
-                # This prevents showing the button when containers are still coming up
-                self.services_running = len(running_services) > 0 and len(starting_services) == 0
+                        starting_services.add(service_name)
+                # Services are running if all expected services are in running state
+                # (i.e., we have all expected services running and none are still starting)
+                self.services_running = len(running_services) == len(expected) and len(starting_services) == 0
             else:
                 self.services_running = False
         except Exception:
@@ -255,15 +276,15 @@ class WelcomeScreen(Screen):
         # Check if services are running
         if self.container_manager.is_available():
             services = await self.container_manager.get_service_status()
+            expected = set(self.container_manager.expected_services)
             running_services = [
                 s.name for s in services.values() if s.status == ServiceStatus.RUNNING
             ]
             starting_services = [
                 s.name for s in services.values() if s.status == ServiceStatus.STARTING
             ]
-            # Only consider services running if we have running services AND no starting services
-            # This prevents showing the button when containers are still coming up
-            self.services_running = len(running_services) > 0 and len(starting_services) == 0
+            # Services are running if all expected services are in running state
+            self.services_running = len(running_services) == len(expected) and len(starting_services) == 0
         else:
             self.services_running = False

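The two hunks above change the readiness rule from "at least one container running and none starting" to "every expected service running and none starting", with stale containers filtered out by compose label or name mapping. A standalone sketch of that logic (the container records mimic `docker compose ps --format json` output; the service and container names are illustrative):

```python
# Sketch of the service-status logic introduced above: only expected
# services count, and all of them must be running before the UI treats
# the stack as ready.

def services_ready(containers, expected, name_map):
    """Return True when every expected service is running and none are starting."""
    running, starting = set(), set()
    for c in containers:
        if not isinstance(c, dict):
            continue
        # Prefer the compose label; fall back to mapping the container name.
        labels = c.get("Labels", {}) or {}
        service = labels.get("com.docker.compose.service", "")
        if not service:
            name = c.get("Name") or c.get("Service", "")
            service = name_map.get(name, name)
        if service not in expected:
            continue  # ignore stale or unrelated containers
        state = str(c.get("State", "")).lower()
        if state == "running":
            running.add(service)
        elif "starting" in state or "created" in state:
            starting.add(service)
    return len(running) == len(expected) and not starting

containers = [
    {"Name": "openrag-backend-1", "Labels": {"com.docker.compose.service": "backend"}, "State": "running"},
    {"Name": "openrag-opensearch-1", "Labels": {}, "State": "running"},
    {"Name": "stale-container", "Labels": {}, "State": "running"},
]
name_map = {"openrag-opensearch-1": "opensearch"}
ready = services_ready(containers, {"backend", "opensearch"}, name_map)  # True
```

Note that the stale container is ignored entirely, so leftover containers from an earlier run can no longer make the stack look ready.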
uv.lock (generated): file diff suppressed because it is too large.