diff --git a/docs/docs/core-components/agents.mdx b/docs/docs/core-components/agents.mdx
index 6702fff4..23d50016 100644
--- a/docs/docs/core-components/agents.mdx
+++ b/docs/docs/core-components/agents.mdx
@@ -58,6 +58,15 @@ This filter is the [Knowledge filter](/knowledge#create-knowledge-filters), and
 
 * An [**MCP Tools** component](https://docs.langflow.org/mcp-client) is connected to the Agent's **Tools** port. This component calls the [**OpenSearch URL Ingestion** flow](/ingestion#url-flow), which Langflow uses as a [Model Context Protocol (MCP) server](https://docs.langflow.org/mcp-server) to fetch content from URLs and store in OpenSearch.
 
+### Nudges
+
+When you use the OpenRAG **Chat**, the **OpenRAG OpenSearch Nudges** flow runs in the background to pull additional context from your knowledge base and chat history.
+
+Nudges appear as prompts in the chat.
+Click a nudge to accept it and provide the nudge's context to the OpenRAG **Chat** agent (the **OpenRAG OpenSearch Agent** flow).
+
+Like OpenRAG's other built-in flows, you can [inspect the flow in Langflow](#inspect-and-modify-flows), and you can customize it if you want to change the nudge behavior.
+
 ## Inspect and modify flows {#inspect-and-modify-flows}
 
 All OpenRAG flows are designed to be modular, performant, and provider-agnostic.
diff --git a/docs/docs/core-components/ingestion.mdx b/docs/docs/core-components/ingestion.mdx
index 7095f343..ef122138 100644
--- a/docs/docs/core-components/ingestion.mdx
+++ b/docs/docs/core-components/ingestion.mdx
@@ -50,7 +50,7 @@ If OpenRAG detects that the local machine is running on macOS, OpenRAG uses the
 
 ### OpenSearch Ingestion flow
 
-The **OpenSearch Ingestion** flow is the default knowledge ingestion flow in OpenRAG. When you **Add Knowledge** in OpenRAG, the OpenSearch Ingestion flow runs in the background. The flow ingests documents using Docling Serve to import and process documents.
+The **OpenSearch Ingestion** flow is the default knowledge ingestion flow in OpenRAG. When you **Add Knowledge** in OpenRAG, the **OpenSearch Ingestion** flow runs in the background. The flow ingests documents using Docling Serve to import and process documents.
 
 If you [inspect the flow in Langflow](/agents#inspect-and-modify-flows), you'll see that it is comprised of ten components that work together to process and store documents in your knowledge base:
 
diff --git a/docs/docs/core-components/knowledge.mdx b/docs/docs/core-components/knowledge.mdx
index 40401781..05f1caec 100644
--- a/docs/docs/core-components/knowledge.mdx
+++ b/docs/docs/core-components/knowledge.mdx
@@ -26,9 +26,7 @@ To configure the knowledge ingestion pipeline parameters, see [Docling Ingestion
 
 ### Direct file ingestion
 
-
-
-The **Knowledge Ingest** flow uses Langflow's [**File** component](https://docs.langflow.org/components-data#file) to split and embed files loaded from your local machine into the OpenSearch database.
+The **OpenSearch Ingestion** flow uses Langflow's [**File** component](https://docs.langflow.org/components-data#file) to split and embed files loaded from your local machine into the OpenSearch database.
 
 The default path to your local folder is mounted from the `./documents` folder in your OpenRAG project directory to the `/app/documents/` directory inside the Docker container. Files added to the host or the container will be visible in both locations.
 To configure this location, modify the **Documents Paths** variable in either the TUI's [Advanced Setup](/install#setup) menu or in the `.env` used by Docker Compose.
@@ -109,7 +107,7 @@ The **Knowledge** page lists the documents OpenRAG has ingested into the OpenSea
 To explore the raw contents of your knowledge base, click
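
For reference, the documents mount described in the `knowledge.mdx` context above can be pictured as a Docker Compose volume entry along the following lines. This is a minimal sketch under stated assumptions, not OpenRAG's actual compose file: the service name `openrag` is an illustrative placeholder, and the exact way the **Documents Paths** setting is wired into Compose may differ in your project.

```yaml
# Minimal sketch of the ./documents mount described in knowledge.mdx.
# Assumption: the service name "openrag" is a placeholder, not a confirmed name.
services:
  openrag:
    volumes:
      # The host ./documents folder is mapped to /app/documents/ inside the
      # container, so files added on either side are visible in both locations.
      # The host path is what the Documents Paths setting (TUI Advanced Setup
      # or .env) lets you change.
      - ./documents:/app/documents
```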