---
title: Quickstart
slug: /quickstart
---

import Icon from "@site/src/components/icon/icon";
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

Get started with OpenRAG by loading your knowledge, swapping out your language model, and then chatting with your flows through the Langflow API.

## Prerequisites

:::tip
This quickstart uses a minimal setup to demonstrate OpenRAG's core functionality. After you complete the quickstart, it is recommended that you reinstall OpenRAG with your preferred configuration because some settings are immutable after initial setup. For all installation options, see [Install OpenRAG with TUI](/install) and [Install OpenRAG with containers](/docker).
:::

Install OpenRAG with the automatic installer.
The script detects and installs uv, Docker/Podman, and Docker Compose prerequisites, and then starts OpenRAG with `uvx`.
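
If you want to check whether those prerequisites are already on your machine before running the installer, a quick lookup like the following works. This is only a convenience sketch using the Python standard library; the installer performs its own detection and installs anything that's missing.

```python
import shutil

# Look for each prerequisite on PATH.
# Note: Compose may be installed as the `docker compose` plugin instead of a
# standalone `docker-compose` binary, in which case it won't appear on PATH.
for tool in ("uv", "docker", "podman", "docker-compose"):
    path = shutil.which(tool)
    print(f"{tool}: {path or 'not found'}")
```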

1. Create a directory to store the OpenRAG configuration files:

    ```bash
    mkdir openrag-workspace
    cd openrag-workspace
    ```

2. Run the installer:

    ```bash
    curl -fsSL https://docs.openr.ag/files/run_openrag_with_prereqs.sh | bash
    ```

    The installer opens the OpenRAG TUI, which creates a `.env` file and Docker Compose files in the current working directory.

3. Select **Basic Setup**.
4. To generate a password for OpenSearch, click **Generate Passwords**.

    The other fields aren't required.

5. To start OpenRAG, click **Start All Services**.

    Startup pulls container images and runs them, so it can take some time.
    When startup is complete, the TUI displays the following:

    ```bash
    Services started successfully
    Command completed successfully
    ```

6. To open the OpenRAG application, navigate to the TUI main menu, and then click **Open App**.

    Alternatively, in your browser, navigate to `localhost:3000`.

7. For your model provider, select **OpenAI**.
8. In the **OpenAI API key** field, paste your OpenAI API key.

    The default model settings are fine for the quickstart.

9. To confirm your provider settings, click **Complete**.
10. To complete onboarding, click **What is OpenRAG**, and then click **Add a document**.

    Alternatively, click <Icon name="ArrowRight" aria-hidden="true"/> **Skip overview**.

To quit OpenRAG, navigate to the TUI main menu and press <kbd>q</kbd>.
To start OpenRAG again, run `uvx openrag`.
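
After the services report as started, or after a later restart with `uvx openrag`, you can confirm that the web UI is reachable without going through the TUI. The following is a minimal sketch that assumes the default quickstart address of `localhost:3000`; it only verifies that the frontend answers, not the health of every backing service.

```python
import requests

# Check that the OpenRAG web UI answers on the default quickstart address.
try:
    response = requests.get("http://localhost:3000", timeout=10)
    print(f"OpenRAG UI responded with HTTP {response.status_code}")
except requests.exceptions.RequestException as exc:
    print(f"OpenRAG UI is not reachable yet: {exc}")
```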

## Load and chat with your own documents

1. In OpenRAG, click <Icon name="MessageSquare" aria-hidden="true"/> **Chat**.

    The chat is powered by the OpenRAG OpenSearch Agent.
    For more information, see [Langflow in OpenRAG](/agents).

2. Ask `What documents are available to you?`

    The agent responds with a message summarizing the documents that OpenRAG loads by default.
    Knowledge is stored in OpenSearch.
    For more information, see [OpenSearch in OpenRAG](/knowledge).

3. To confirm the agent is correct about the default knowledge, click <Icon name="Library" aria-hidden="true"/> **Knowledge**.

    The **Knowledge** page lists the documents OpenRAG has ingested into the OpenSearch vector database.
    Click a document to display the chunks that were created when the document was split and ingested into the OpenSearch vector database.

4. To add documents to your knowledge base, click **Add Knowledge**.

    * Select <Icon name="File" aria-hidden="true"/> **File** to add a single file from your local machine.
    * Select <Icon name="Folder" aria-hidden="true"/> **Folder** to process an entire folder of documents from your local machine. The default directory is `/documents` in your OpenRAG directory.
    * Select your cloud storage provider to add knowledge from an OAuth-connected storage provider. For more information, see [OAuth ingestion](/knowledge#oauth-ingestion).

5. Return to the Chat window and ask a question about your loaded data.

    For example, with a manual about a PC tablet loaded, ask `How do I connect this device to WiFi?`
    The agent responds with a message indicating it now has your knowledge as context for answering questions.

6. Click <Icon name="Gear" aria-hidden="true"/> **Function Call: search_documents (tool_call)**.

    This log describes how the agent uses tools.
    This is helpful for troubleshooting when the agent isn't responding as expected.

## Swap out the language model to modify agent behavior {#change-components}

To modify the knowledge ingestion or agent behavior, click <Icon name="Settings2" aria-hidden="true"/> **Settings**.

In this example, you'll try a different LLM to demonstrate how the agent's response changes.

1. To edit the agent's behavior, click **Edit in Langflow**.

    You can change these parameters more quickly in the **Language model** and **Agent Instructions** fields on this page, but for illustration purposes, this example uses the Langflow visual builder.
    To revert the flow to its initial state, click **Restore flow**.

2. OpenRAG warns you that you're entering Langflow. Click **Proceed**.

    If Langflow requests login information, enter the `LANGFLOW_SUPERUSER` and `LANGFLOW_SUPERUSER_PASSWORD` values from the `.env` file in your OpenRAG directory, as shown in the sketch after these steps.

    The OpenRAG OpenSearch Agent flow appears in a new browser window.

    

3. Find the **Language Model** component, and then change the **Model Name** field to a different OpenAI model.
4. Save your flow with <kbd>Command+S</kbd> (Mac) or <kbd>Ctrl+S</kbd> (Windows).
5. Return to the OpenRAG browser window, and start a new conversation by clicking <Icon name="Plus" aria-hidden="true"/> in the **Conversations** tab.
6. Ask the same question you asked before to see how the response differs between models.
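
If you need the Langflow login credentials mentioned in step 2, they are stored in the `.env` file that the installer wrote to your workspace. The following is a minimal sketch that assumes the file sits in the `openrag-workspace` directory from the installation steps and uses plain `KEY=VALUE` lines; opening the file in a text editor works just as well.

```python
from pathlib import Path

# Print the Langflow superuser credentials from the generated .env file.
# Adjust the path if your workspace directory is named differently.
env_file = Path("openrag-workspace/.env")
for line in env_file.read_text().splitlines():
    if line.startswith(("LANGFLOW_SUPERUSER=", "LANGFLOW_SUPERUSER_PASSWORD=")):
        print(line)
```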

## Integrate OpenRAG into your application

Langflow in OpenRAG includes pre-built flows that you can integrate into your applications using the [Langflow API](https://docs.langflow.org/api-reference-api-examples).

The Langflow API accepts Python, TypeScript, or curl requests to run flows and get responses. You can use these flows as-is or modify them to better suit your needs.

In this section, you'll run the OpenRAG OpenSearch Agent flow and get a response using the API.

1. To navigate to the OpenRAG OpenSearch Agent flow in Langflow, click <Icon name="Settings2" aria-hidden="true"/> **Settings**, and then click **Edit in Langflow** in the OpenRAG OpenSearch Agent flow.
2. Create a [Langflow API key](https://docs.langflow.org/api-keys-and-authentication).

    A Langflow API key is a user-specific token you can use with Langflow.
    It is **only** used for sending requests to the Langflow server.
    It does **not** access OpenRAG.

    To create a Langflow API key, do the following:

    1. Open Langflow, click your user icon, and then select **Settings**.
    2. Click **Langflow API Keys**, and then click <Icon name="Plus" aria-hidden="true"/> **Add New**.
    3. Name your key, and then click **Create API Key**.
    4. Copy the API key and store it securely.

3. Retrieve the code snippets for the request to the Langflow API: click **Share**, and then click **API access**.

    The default code in the API access pane constructs a request with the Langflow server `url`, `headers`, and a `payload` of request data. The code snippets automatically include the `LANGFLOW_SERVER_ADDRESS` and `FLOW_ID` values for the flow.

    <Tabs>
    <TabItem value="python" label="Python">

    ```python
    import requests
    import os
    import uuid

    api_key = 'LANGFLOW_API_KEY'
    url = "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID"  # The complete API endpoint URL for this flow

    # Request payload configuration
    payload = {
        "output_type": "chat",
        "input_type": "chat",
        "input_value": "hello world!"
    }
    payload["session_id"] = str(uuid.uuid4())

    headers = {"x-api-key": api_key}

    try:
        # Send API request
        response = requests.request("POST", url, json=payload, headers=headers)
        response.raise_for_status()  # Raise exception for bad status codes

        # Print response
        print(response.text)

    except requests.exceptions.RequestException as e:
        print(f"Error making API request: {e}")
    except ValueError as e:
        print(f"Error parsing response: {e}")
    ```

    </TabItem>
    <TabItem value="typescript" label="TypeScript">

    ```typescript
    const crypto = require('crypto');
    const apiKey = 'LANGFLOW_API_KEY';
    const payload = {
        "output_type": "chat",
        "input_type": "chat",
        "input_value": "hello world!"
    };
    payload.session_id = crypto.randomUUID();

    const options = {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            "x-api-key": apiKey
        },
        body: JSON.stringify(payload)
    };

    fetch('http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID', options)
        .then(response => response.json())
        .then(response => console.warn(response))
        .catch(err => console.error(err));
    ```

    </TabItem>
    <TabItem value="curl" label="curl">

    ```bash
    curl --request POST \
      --url 'http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID?stream=false' \
      --header 'Content-Type: application/json' \
      --header "x-api-key: LANGFLOW_API_KEY" \
      --data '{
        "output_type": "chat",
        "input_type": "chat",
        "input_value": "hello world!"
      }'
    ```

    </TabItem>
    </Tabs>

4. Copy the snippet, paste it in a script file, and then run the script to send the request. If you are using the curl snippet, you can run the command directly in your terminal.

    If the request is successful, the response includes many details about the flow run, including the session ID, inputs, outputs, components, durations, and more.
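
To pull just the chat reply out of that response, parse the returned JSON. The following is a minimal sketch that builds on the Python snippet above: it reuses the `LANGFLOW_SERVER_ADDRESS` and `FLOW_ID` placeholders, reads the API key from a `LANGFLOW_API_KEY` environment variable (a convention for this example, not a requirement), and assumes the typical key path for a Langflow chat output, so adjust the lookup if your flow returns a different structure.

```python
import os
import uuid

import requests

# Same placeholders as the generated snippets; replace them with your values.
url = "http://LANGFLOW_SERVER_ADDRESS/api/v1/run/FLOW_ID"
headers = {"x-api-key": os.environ["LANGFLOW_API_KEY"]}
payload = {
    "output_type": "chat",
    "input_type": "chat",
    "input_value": "What documents are available to you?",
    "session_id": str(uuid.uuid4()),
}

response = requests.post(url, json=payload, headers=headers)
response.raise_for_status()
data = response.json()

try:
    # Assumed key path for a typical chat output; inspect the full response
    # if this lookup doesn't match your flow's structure.
    print(data["outputs"][0]["outputs"][0]["results"]["message"]["text"])
except (KeyError, IndexError, TypeError):
    print(data)
```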

To further explore the API, see:

* The Langflow [Quickstart](https://docs.langflow.org/quickstart#extract-data-from-the-response) extends this example by extracting fields from the response.
* [Get started with the Langflow API](https://docs.langflow.org/api-reference-api-examples)