# Graphiti MCP Server

Graphiti is a framework for building and querying temporally-aware knowledge graphs, specifically tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) methods, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without requiring complete graph recomputation, making it suitable for developing interactive, context-aware AI applications.

This is an experimental Model Context Protocol (MCP) server implementation for Graphiti. The MCP server exposes Graphiti's key functionality through the MCP protocol, allowing AI assistants to interact with Graphiti's knowledge graph capabilities.

## Features

The Graphiti MCP server exposes the following key high-level functions of Graphiti:

- **Episode Management**: Add, retrieve, and delete episodes (text, messages, or JSON data)
- **Entity Management**: Search and manage entity nodes and relationships in the knowledge graph
- **Search Capabilities**: Search for facts (edges) and node summaries using semantic and hybrid search
- **Group Management**: Organize and manage groups of related data with group_id filtering
- **Graph Maintenance**: Clear the graph and rebuild indices

## Quick Start for Claude Desktop, Cursor, and other clients

1. Clone the Graphiti GitHub repo

   ```bash
   git clone https://github.com/getzep/graphiti.git
   ```

   or

   ```bash
   gh repo clone getzep/graphiti
   ```

   Note the full path to this directory.

   ```bash
   cd graphiti && pwd
   ```

2. Install the [Graphiti prerequisites](#prerequisites).

3. Configure Claude, Cursor, or another MCP client to use [Graphiti with a `stdio` transport](#integrating-with-mcp-clients). See the client's documentation for where to find its MCP configuration file.

## Installation

### Prerequisites

1. Python 3.10 or higher installed
2. A running Neo4j database (version 5.26 or later)
3. An OpenAI API key for LLM operations

### Setup

1. Clone the repository and navigate to the `mcp_server` directory
2. Use `uv` to create a virtual environment and install dependencies:

   ```bash
   # Install uv if you don't have it already
   curl -LsSf https://astral.sh/uv/install.sh | sh

   # Create a virtual environment and install dependencies in one step
   uv sync
   ```

## Configuration

The server uses the following environment variables:

- `NEO4J_URI`: URI for the Neo4j database (default: `bolt://localhost:7687`)
- `NEO4J_USER`: Neo4j username (default: `neo4j`)
- `NEO4J_PASSWORD`: Neo4j password (default: `demodemo`)
- `OPENAI_API_KEY`: OpenAI API key (required for LLM operations)
- `OPENAI_BASE_URL`: Optional base URL for the OpenAI API
- `MODEL_NAME`: OpenAI model name to use for LLM operations
- `SMALL_MODEL_NAME`: OpenAI model name to use for smaller LLM operations
- `LLM_TEMPERATURE`: Temperature for LLM responses (0.0-2.0)
- `AZURE_OPENAI_ENDPOINT`: Optional Azure OpenAI LLM endpoint URL
- `AZURE_OPENAI_DEPLOYMENT_NAME`: Optional Azure OpenAI LLM deployment name
- `AZURE_OPENAI_API_VERSION`: Optional Azure OpenAI LLM API version
- `AZURE_OPENAI_EMBEDDING_API_KEY`: Optional Azure OpenAI embedding deployment key (if different from `OPENAI_API_KEY`)
- `AZURE_OPENAI_EMBEDDING_ENDPOINT`: Optional Azure OpenAI embedding endpoint URL
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME`: Optional Azure OpenAI embedding deployment name
- `AZURE_OPENAI_EMBEDDING_API_VERSION`: Optional Azure OpenAI embedding API version
- `AZURE_OPENAI_USE_MANAGED_IDENTITY`: Optional flag to use Azure Managed Identities for authentication
- `SEMAPHORE_LIMIT`: Episode processing concurrency. See [Concurrency and LLM Provider 429 Rate Limit Errors](#concurrency-and-llm-provider-429-rate-limit-errors)

You can set these variables in a `.env` file in the project directory.

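For a local quick start, a minimal `.env` might look like the sketch below (placeholder API key; the Neo4j values match the defaults above):

```
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini
```
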
## Running the Server

To run the Graphiti MCP server directly using `uv`:

```bash
uv run graphiti_mcp_server.py
```

With options:

```bash
uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse
```

Available arguments:

- `--model`: Overrides the `MODEL_NAME` environment variable
- `--small-model`: Overrides the `SMALL_MODEL_NAME` environment variable
- `--temperature`: Overrides the `LLM_TEMPERATURE` environment variable
- `--transport`: Choose the transport method (`sse` or `stdio`, default: `sse`)
- `--group-id`: Set a namespace for the graph (optional). If not provided, defaults to "default"
- `--destroy-graph`: If set, destroys all Graphiti graphs on startup
- `--use-custom-entities`: Enable entity extraction using the predefined `ENTITY_TYPES`

### Concurrency and LLM Provider 429 Rate Limit Errors

Graphiti's ingestion pipelines are designed for high concurrency, controlled by the `SEMAPHORE_LIMIT` environment variable. By default, `SEMAPHORE_LIMIT` is set to `10` concurrent operations to help prevent `429` rate limit errors from your LLM provider. If you encounter such errors, try lowering this value.

If your LLM provider allows higher throughput, you can increase `SEMAPHORE_LIMIT` to boost episode ingestion performance.

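For example, to halve the default concurrency when a provider keeps returning `429`s, set the variable in your `.env` file or inline for a single run:

```bash
# Persistent: add to .env
# SEMAPHORE_LIMIT=5

# One-off: set inline when starting the server
SEMAPHORE_LIMIT=5 uv run graphiti_mcp_server.py
```
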
### Docker Deployment

The Graphiti MCP server can be deployed using Docker. The Dockerfile uses `uv` for package management, ensuring consistent dependency installation.

#### Environment Configuration

Before running the Docker Compose setup, you need to configure the environment variables. You have two options:

1. **Using a .env file** (recommended):

   - Copy the provided `.env.example` file to create a `.env` file:

     ```bash
     cp .env.example .env
     ```

   - Edit the `.env` file to set your OpenAI API key and other configuration options:

     ```
     # Required for LLM operations
     OPENAI_API_KEY=your_openai_api_key_here
     MODEL_NAME=gpt-4.1-mini
     # Optional: OPENAI_BASE_URL only needed for non-standard OpenAI endpoints
     # OPENAI_BASE_URL=https://api.openai.com/v1
     ```

   - The Docker Compose setup is configured to use this file if it exists (it's optional)

2. **Using environment variables directly**:

   - You can also set the environment variables when running the Docker Compose command:

     ```bash
     OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
     ```

#### Neo4j Configuration

The Docker Compose setup includes a Neo4j container with the following default configuration:

- Username: `neo4j`
- Password: `demodemo`
- URI: `bolt://neo4j:7687` (from within the Docker network)
- Memory settings optimized for development use

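To verify the database is reachable once the containers are up, you can run a trivial query through `cypher-shell` inside the Neo4j container; this sketch assumes the Compose service is named `neo4j` (check your `docker-compose.yml` if not):

```bash
# Run a no-op query against the containerized Neo4j using the default credentials
docker compose exec neo4j cypher-shell -u neo4j -p demodemo "RETURN 1;"
```
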
#### Running with Docker Compose

Start the services using Docker Compose:

```bash
docker compose up
```

Or if you're using an older version of Docker Compose:

```bash
docker-compose up
```

This will start both the Neo4j database and the Graphiti MCP server. The Docker setup:

- Uses `uv` for package management and running the server
- Installs dependencies from the `pyproject.toml` file
- Connects to the Neo4j container using the environment variables
- Exposes the server on port 8000 for HTTP-based SSE transport
- Includes a healthcheck for Neo4j to ensure it's fully operational before starting the MCP server

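As a quick smoke test once the services are up, you can confirm the SSE endpoint is listening; the connection should stay open and stream events (Ctrl+C to stop):

```bash
# -N disables curl's output buffering so streamed SSE events appear immediately
curl -N http://localhost:8000/sse
```
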
## Integrating with MCP Clients

### Configuration

To use the Graphiti MCP server with an MCP-compatible client, configure it to connect to the server:

> [!IMPORTANT]
> You will need the `uv` Python package manager installed. Please refer to the [`uv` install instructions](https://docs.astral.sh/uv/getting-started/installation/).
>
> Ensure that you set the full path to the `uv` binary and your Graphiti project folder.

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```

For SSE transport (HTTP-based), you can use this configuration:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```

## Available Tools

The Graphiti MCP server exposes the following tools:

- `add_episode`: Add an episode to the knowledge graph (supports text, JSON, and message formats)
- `search_nodes`: Search the knowledge graph for relevant node summaries
- `search_facts`: Search the knowledge graph for relevant facts (edges between entities)
- `delete_entity_edge`: Delete an entity edge from the knowledge graph
- `delete_episode`: Delete an episode from the knowledge graph
- `get_entity_edge`: Get an entity edge by its UUID
- `get_episodes`: Get the most recent episodes for a specific group
- `clear_graph`: Clear all data from the knowledge graph and rebuild indices
- `get_status`: Get the status of the Graphiti MCP server and Neo4j connection

## Working with JSON Data

The Graphiti MCP server can process structured JSON data through the `add_episode` tool with `source="json"`. This allows you to automatically extract entities and relationships from structured data:

```
add_episode(
    name="Customer Profile",
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
    source="json",
    source_description="CRM data"
)
```

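Hand-escaping the quotes in `episode_body` is error-prone. If you are composing the call from Python, a minimal sketch using `json.dumps` to build the same payload might look like this (`add_episode` here stands in for however your client invokes the MCP tool):

```python
import json

# Build the structured payload as a plain dict; json.dumps
# handles all of the quote escaping for us.
payload = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}

add_episode(
    name="Customer Profile",
    episode_body=json.dumps(payload),
    source="json",
    source_description="CRM data",
)
```
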
## Integrating with the Cursor IDE

To integrate the Graphiti MCP Server with the Cursor IDE, follow these steps:

1. Run the Graphiti MCP server using the SSE transport:

   ```bash
   uv run graphiti_mcp_server.py --transport sse --use-custom-entities --group-id <your_group_id>
   ```

   Hint: specify a `group_id` to namespace graph data. If you do not specify a `group_id`, the server will use "default" as the group_id.

   or

   ```bash
   docker compose up
   ```

2. Configure Cursor to connect to the Graphiti MCP server.

   ```json
   {
     "mcpServers": {
       "graphiti-memory": {
         "url": "http://localhost:8000/sse"
       }
     }
   }
   ```

3. Add the Graphiti rules to Cursor's User Rules. See [cursor_rules.md](cursor_rules.md) for details.

4. Kick off an agent session in Cursor.

The integration enables AI assistants in Cursor to maintain persistent memory through Graphiti's knowledge graph capabilities.

## Integrating with Claude Desktop (Docker MCP Server)

The Graphiti MCP Server container uses the SSE MCP transport. Claude Desktop does not natively support SSE, so you'll need to use a gateway like `mcp-remote`.

1. **Run the Graphiti MCP server using SSE transport**:

   ```bash
   docker compose up
   ```

2. **(Optional) Install `mcp-remote` globally**:
   If you prefer to have `mcp-remote` installed globally, or if you encounter issues with `npx` fetching the package, you can install it globally. Otherwise, `npx` (used in the next step) will handle it for you.

   ```bash
   npm install -g mcp-remote
   ```

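   To sanity-check the gateway before wiring it into Claude Desktop, you can run it directly from a terminal (this assumes the server from step 1 is up on port 8000):

   ```bash
   npx mcp-remote http://localhost:8000/sse
   ```
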
3. **Configure Claude Desktop**:
   Open your Claude Desktop configuration file (usually `claude_desktop_config.json`) and add or modify the `mcpServers` section as follows:

   ```json
   {
     "mcpServers": {
       "graphiti-memory": {
         "command": "npx",
         "args": [
           "mcp-remote",
           "http://localhost:8000/sse"
         ]
       }
     }
   }
   ```

   You can choose a different server name than `graphiti-memory` if you prefer. If `npx` is not in your PATH, set `command` to the full path of the `mcp-remote` binary, and make sure the URL matches your Graphiti server's SSE endpoint.

   If you already have an `mcpServers` entry, add `graphiti-memory` (or your chosen name) as a new key within it.

4. **Restart Claude Desktop** for the changes to take effect.

## Requirements

- Python 3.10 or higher
- Neo4j database (version 5.26 or later required)
- OpenAI API key (for LLM operations and embeddings)
- MCP-compatible client

## License

This project is licensed under the same license as the parent Graphiti project.