# Quickstart

Get started with OpenRAG by loading your knowledge, swapping out your language model, and integrating OpenRAG into your application.

## Prerequisites

- Install and start OpenRAG with the [TUI](/install) or [Docker](/get-started/docker)

## Load and chat with your own documents

1. In OpenRAG, click <Icon name="MessageSquare" aria-hidden="true"/> **Chat**.

    The chat is powered by the OpenRAG OpenSearch Agent.
    For more information, see [Langflow Agents](/agents).

2. Ask `What documents are available to you?`

    The agent responds with a message summarizing the documents that OpenRAG loads by default.

    Knowledge is stored in OpenSearch.
    For more information, see [Knowledge](/knowledge).

3. To confirm the agent is correct about the default knowledge, click <Icon name="Library" aria-hidden="true"/> **Knowledge**.

    The **Knowledge** page lists the documents OpenRAG has ingested into the OpenSearch vector database.
    Click a document to display the chunks that were created when it was split and stored in the vector database.

## Add your own knowledge

1. To add documents to your knowledge base, click <Icon name="Plus" aria-hidden="true"/> **Add Knowledge**.

    * Select **Add File** to add a single file from your local machine.
    * Select **Process Folder** to process an entire folder of documents from your local machine.
    * Select your cloud storage provider to add knowledge from an OAuth-connected storage provider. For more information, see [OAuth ingestion](/knowledge#oauth-ingestion).

2. Return to the **Chat** window, and ask a question about your loaded data.

    For example, with a manual about a PC tablet loaded, ask `How do I connect this device to Wi-Fi?`

    The agent responds with a message indicating that it now has your knowledge as context for answering questions.

3. Click <Icon name="Gear" aria-hidden="true"/> **Function Call: search_documents (tool_call)**.

    This log describes how the agent uses tools, which is helpful for troubleshooting when the agent isn't responding as expected.

## Swap out the language model to modify agent behavior {#change-components}

In this example, you'll try a different LLM to demonstrate how the Agent's responses change.

1. To edit the Agent's behavior, click **Edit in Langflow**.

    You can access the **Language Model** and **Agent Instructions** fields more quickly on this page, but for illustration purposes, this example uses the Langflow visual builder.

2. OpenRAG warns you that you're entering Langflow. Click **Proceed**.

    The OpenRAG OpenSearch Agent flow appears in a new browser window.

    

3. Find the **Language Model** component, and then change the **Model Name** field to a different OpenAI model.

4. Save your flow with <kbd>Command+S</kbd> (Mac) or <kbd>Ctrl+S</kbd> (Windows).

5. Return to the OpenRAG browser window, and start a new conversation by clicking <Icon name="Plus" aria-hidden="true"/> in the **Conversations** tab.

6. Ask the same question you asked before to see how the response differs between models.

## Integrate OpenRAG into your application

Langflow in OpenRAG includes pre-built flows that you can integrate into your applications using the [Langflow API](https://docs.langflow.org/api-reference-api-examples).

In this section, you'll run the OpenRAG OpenSearch Agent flow and get a response using the API.

1. To navigate to the OpenRAG OpenSearch Agent flow in Langflow, click <Icon name="Settings2" aria-hidden="true"/> **Settings**, and then click **Edit in Langflow** in the OpenRAG OpenSearch Agent flow.

2. Create a [Langflow API key](https://docs.langflow.org/api-keys-and-authentication).

    A Langflow API key is a user-specific token you can use with Langflow.
    It is **only** used for sending requests to the Langflow server.
    It does **not** grant access to OpenRAG.

    To create a Langflow API key, do the following:

    1. Open Langflow, click your user icon, and then select **Settings**.
    2. Click **Langflow API Keys**, and then click <Icon name="Plus" aria-hidden="true"/> **Add New**.
    3. Name your key, and then click **Create API Key**.
    4. Copy the API key and store it securely.
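
    If you'd rather not paste the raw key into the code snippets in the next step, one option is to store it in an environment variable and reference it in the `x-api-key` header. The following is a minimal sketch that uses the placeholder `LANGFLOW_SERVER_ADDRESS` and `FLOW_ID` values:

    ```bash
    # Store the API key in an environment variable instead of hardcoding it
    export LANGFLOW_API_KEY="sk..."

    # Reference the variable in the x-api-key header of a test request
    curl --request POST \
      --url "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID" \
      --header "Content-Type: application/json" \
      --header "x-api-key: $LANGFLOW_API_KEY" \
      --data '{
        "output_type": "chat",
        "input_type": "chat",
        "input_value": "Hello"
      }'
    ```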

3. Langflow includes code snippets for the request to the Langflow API.

    To retrieve the code snippet, click **Share**, and then click **API access**.

    The default code in the API access pane constructs a request with the Langflow server `url`, `headers`, and a `payload` of request data. The code snippets automatically include the `LANGFLOW_SERVER_ADDRESS` and `FLOW_ID` values for the flow.

    <Tabs>
    <TabItem value="python" label="Python">

    ```bash
    curl --request POST \
      --url 'http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID?stream=false' \
      --header 'Content-Type: application/json' \
      --header "x-api-key: LANGFLOW_API_KEY" \
      --data '{
        "output_type": "chat",
        "input_type": "chat",
        "input_value": "hello world!"
      }'
    ```

    </TabItem>

    </Tabs>

4. Copy the snippet, paste it in a script file, and then run the script to send the request. If you are using the curl snippet, you can run the command directly in your terminal.

    If the request is successful, the response includes many details about the flow run, including the session ID, inputs, outputs, components, durations, and more.
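
    If you only need the agent's reply, you can extract it from the response JSON. The following is a minimal sketch that assumes the reply text is at `.outputs[0].outputs[0].results.message.text` in the chat response and that `jq` is installed:

    ```bash
    # Run the flow and print only the agent's reply text
    curl --silent --request POST \
      --url 'http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID?stream=false' \
      --header 'Content-Type: application/json' \
      --header "x-api-key: LANGFLOW_API_KEY" \
      --data '{"output_type": "chat", "input_type": "chat", "input_value": "hello world!"}' \
      | jq -r '.outputs[0].outputs[0].results.message.text'
    ```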

To further explore the API, see the [Langflow API documentation](https://docs.langflow.org/api-reference-api-examples).