Replace Kuzu with FalkorDB as default database
BREAKING CHANGE: Kuzu is no longer supported. FalkorDB is now the default.

- Renamed Dockerfile.falkordb-combined to Dockerfile (default)
- Renamed docker-compose-falkordb-combined.yml to docker-compose.yml (default)
- Updated config.yaml to use FalkorDB with localhost:6379 as default
- Removed Kuzu from pyproject.toml dependencies (now only the falkordb extra)
- Updated Dockerfile to use graphiti-core[falkordb] instead of [kuzu,falkordb]
- Completely removed all Kuzu references from README
- Updated README to document the FalkorDB combined container as the default
- Docker Compose now starts a single container with FalkorDB + MCP server
- Prerequisites now require Docker instead of Python for the default setup
- Removed old Kuzu docker-compose files

Running from the command line now requires an external FalkorDB instance at localhost:6379.
Parent: d1bb8554a6
Commit: b9ac3efb69
7 changed files with 138 additions and 261 deletions
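A minimal sketch of exercising the new default setup described in the commit message (assumes Docker, redis-cli, and curl are available locally; the ports and endpoint are those documented in the README changes below):

```bash
# Start the combined FalkorDB + MCP server container (from mcp_server/)
docker compose up -d

# FalkorDB answers on the published Redis port
redis-cli -p 6379 ping        # expect: PONG

# The MCP server listens over HTTP on port 8000 at /mcp/
curl -i http://localhost:8000/mcp/

# The FalkorDB web UI is published on port 3000
# (open http://localhost:3000 in a browser)
```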
README.md

@@ -20,7 +20,7 @@ The Graphiti MCP server provides comprehensive knowledge graph capabilities:
 - **Search Capabilities**: Search for facts (edges) and node summaries using semantic and hybrid search
 - **Group Management**: Organize and manage groups of related data with group_id filtering
 - **Graph Maintenance**: Clear the graph and rebuild indices
-- **Graph Database Support**: Multiple backend options including Kuzu (default), Neo4j, and FalkorDB
+- **Graph Database Support**: Multiple backend options including FalkorDB (default) and Neo4j
 - **Multiple LLM Providers**: Support for OpenAI, Anthropic, Gemini, Groq, and Azure OpenAI
 - **Multiple Embedding Providers**: Support for OpenAI, Voyage, Sentence Transformers, and Gemini embeddings
 - **Rich Entity Types**: Built-in entity types including Preferences, Requirements, Procedures, Locations, Events, Organizations, Documents, and more for structured knowledge extraction
@@ -59,16 +59,17 @@ cd graphiti && pwd
 `cd graphiti/mcp_server`

-2. Option A: Run with default Kuzu database (no Docker required)
+2. Start the combined FalkorDB + MCP server using Docker Compose (recommended)

 ```bash
-uv run graphiti_mcp_server.py
+docker compose up
 ```

-3. Option B: Run with Neo4j using Docker Compose
+   This starts both FalkorDB and the MCP server in a single container.
+
+   **Alternative**: Run with separate containers using Neo4j:

 ```bash
-docker compose up # or docker compose -f docker/docker-compose-neo4j.yml up
+docker compose -f docker/docker-compose-neo4j.yml up
 ```

 4. Point your MCP client to `http://localhost:8000/mcp/`
@@ -77,9 +78,9 @@ docker compose up # or docker compose -f docker/docker-compose-neo4j.yml up
 ### Prerequisites

-1. Ensure you have Python 3.10 or higher installed.
+1. Docker and Docker Compose (for the default FalkorDB setup)
 2. OpenAI API key for LLM operations (or API keys for other supported LLM providers)
-3. (Optional) A running Neo4j or FalkorDB instance if you prefer not to use the default Kuzu database
+3. (Optional) Python 3.10+ if running the MCP server standalone with an external FalkorDB instance

 ### Setup
@@ -105,23 +106,24 @@ The server can be configured using a `config.yaml` file, environment variables,
 The MCP server comes with sensible defaults:
 - **Transport**: HTTP (accessible at `http://localhost:8000/mcp/`)
-- **Database**: Kuzu (in-memory, no external dependencies required)
-- **LLM**: OpenAI with model gpt-4.1
+- **Database**: FalkorDB (combined in single container with MCP server)
+- **LLM**: OpenAI with model gpt-5-mini
 - **Embedder**: OpenAI text-embedding-3-small

 ### Database Configuration

-#### Kuzu (Default)
+#### FalkorDB (Default)

-Kuzu is an embedded in-memory graph database requiring no external services. While archived by its original authors, we use it as the default for its simplicity and zero-dependency setup. We hope the community continues to maintain this project.
+FalkorDB is a Redis-based graph database that comes bundled with the MCP server in a single Docker container. This is the default and recommended setup.

 ```yaml
 database:
-  provider: "kuzu" # Default - no additional setup required
+  provider: "falkordb" # Default
   providers:
-    kuzu:
-      db: ":memory:" # In-memory database (default)
-      # Or use a persistent file: db: "/path/to/database.kuzu"
+    falkordb:
+      uri: "redis://localhost:6379"
+      password: "" # Optional
+      database: "default_db" # Optional
 ```

 #### Neo4j
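A quick way to confirm that the MCP server is talking to the FalkorDB instance configured above is to query FalkorDB directly over the Redis protocol; a sketch, assuming redis-cli is installed and the graph name matches the configured `database` value (`default_db` unless overridden):

```bash
# List the graphs FalkorDB currently holds
redis-cli -p 6379 GRAPH.LIST

# Run an ad-hoc Cypher query against the configured graph
redis-cli -p 6379 GRAPH.QUERY default_db "MATCH (n) RETURN count(n)"
```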
@@ -166,7 +168,7 @@ llm:
   model: "gpt-4.1" # Default model

 database:
-  provider: "kuzu" # Default. Options: "neo4j", "falkordb"
+  provider: "falkordb" # Default. Options: "falkordb", "neo4j"
 ```

 ### Using Ollama for Local LLM
@@ -241,18 +243,19 @@ You can set these variables in a `.env` file in the project directory.
 ## Running the Server

-### Default Setup (Kuzu Database)
+### Default Setup (FalkorDB Combined Container)

-To run the Graphiti MCP server with the default Kuzu in-memory database:
+To run the Graphiti MCP server with the default FalkorDB setup:

 ```bash
-uv run graphiti_mcp_server.py
+docker compose up
 ```

-This starts the server with:
+This starts a single container with:
 - HTTP transport on `http://localhost:8000/mcp/`
-- Kuzu in-memory database (no external dependencies)
-- OpenAI LLM with gpt-4.1 model
+- FalkorDB graph database on `localhost:6379`
+- FalkorDB web UI on `http://localhost:3000`
+- OpenAI LLM with gpt-5-mini model

 ### Running with Neo4j
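As noted in the commit message, running the server from the command line now requires an external FalkorDB at localhost:6379. A sketch of that workflow, assuming the entry point is main.py as in the Dockerfile below:

```bash
# Start a standalone FalkorDB (Redis protocol on 6379, web UI on 3000)
docker run --rm -d --name falkordb -p 6379:6379 -p 3000:3000 falkordb/falkordb:latest

# Run the MCP server against it; the default config points at redis://localhost:6379
cd graphiti/mcp_server
uv run main.py

# Shut the database down when finished
docker stop falkordb
```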
@@ -316,7 +319,7 @@ uv run graphiti_mcp_server.py --config config/config-docker-falkordb.yaml
 - `--config`: Path to YAML configuration file (default: config.yaml)
 - `--llm-provider`: LLM provider to use (openai, anthropic, gemini, groq, azure_openai)
 - `--embedder-provider`: Embedder provider to use (openai, azure_openai, gemini, voyage)
-- `--database-provider`: Database provider to use (kuzu, neo4j, falkordb) - default: kuzu
+- `--database-provider`: Database provider to use (falkordb, neo4j) - default: falkordb
 - `--model`: Model name to use with the LLM client
 - `--temperature`: Temperature setting for the LLM (0.0-2.0)
 - `--transport`: Choose the transport method (http or stdio, default: http)
@@ -370,17 +373,17 @@ Before running Docker Compose, configure your API keys using a `.env` file (reco
 cd graphiti/mcp_server
 ```

-##### Option 1: Kuzu Database (Default, No External Database)
+##### Option 1: FalkorDB Combined Container (Default)

-Uses Kuzu in-memory database - fastest and simplest option:
+Single container with both FalkorDB and MCP server - simplest option:

 ```bash
-docker compose -f docker/docker-compose.yml up
+docker compose up
 ```

 ##### Option 2: Neo4j Database

-Includes a Neo4j container with persistent storage:
+Separate containers with Neo4j and MCP server:

 ```bash
 docker compose -f docker/docker-compose-neo4j.yml up
@@ -392,9 +395,9 @@ Default Neo4j credentials:
 - Bolt URI: `bolt://neo4j:7687`
 - Browser UI: `http://localhost:7474`

-##### Option 3: FalkorDB Database (Separate Containers)
+##### Option 3: FalkorDB with Separate Containers

-Includes separate FalkorDB and MCP server containers:
+Alternative setup with separate FalkorDB and MCP server containers:

 ```bash
 docker compose -f docker/docker-compose-falkordb.yml up
@@ -405,22 +408,6 @@ FalkorDB configuration:
 - Web UI: `http://localhost:3000`
 - Connection: `redis://falkordb:6379`

-##### Option 4: FalkorDB + MCP Server (Combined Image)
-
-Single container with both FalkorDB and MCP server bundled together:
-
-```bash
-docker compose -f docker/docker-compose-falkordb-combined.yml up
-```
-
-This combined setup offers:
-- Simplified deployment (one container to manage)
-- Reduced network latency (localhost communication)
-- Easier development workflow
-- Unified logging via Supervisor
-
-See [docker/README-falkordb-combined.md](docker/README-falkordb-combined.md) for detailed documentation.
-
 #### Accessing the MCP Server

 Once running, the MCP server is available at:
@@ -626,7 +613,8 @@ The Graphiti MCP Server uses HTTP transport (at endpoint `/mcp/`). Claude Deskto
 - Python 3.10 or higher
 - OpenAI API key (for LLM operations and embeddings) or other LLM provider API keys
 - MCP-compatible client
-- (Optional) Neo4j database (version 5.26 or later) or FalkorDB if not using default Kuzu
+- Docker and Docker Compose (for the default FalkorDB combined container)
+- (Optional) Neo4j database (version 5.26 or later) if not using the default FalkorDB setup

 ## Telemetry
config.yaml

@@ -67,24 +67,20 @@ embedder:
     model: "voyage-3"

 database:
-  provider: "kuzu" # Options: neo4j, falkordb, kuzu
+  provider: "falkordb" # Default: falkordb. Options: neo4j, falkordb

   providers:
+    falkordb:
+      uri: ${FALKORDB_URI:redis://localhost:6379}
+      password: ${FALKORDB_PASSWORD:}
+      database: ${FALKORDB_DATABASE:default_db}
+
     neo4j:
       uri: ${NEO4J_URI:bolt://localhost:7687}
       username: ${NEO4J_USER:neo4j}
       password: ${NEO4J_PASSWORD}
       database: ${NEO4J_DATABASE:neo4j}
       use_parallel_runtime: ${USE_PARALLEL_RUNTIME:false}

-    falkordb:
-      uri: ${FALKORDB_URI:redis://localhost:6379}
-      password: ${FALKORDB_PASSWORD:}
-      database: ${FALKORDB_DATABASE:default_db}
-
-    kuzu:
-      db: ${KUZU_DB::memory:}
-      max_concurrent_queries: ${KUZU_MAX_CONCURRENT_QUERIES:1}
-
 graphiti:
   group_id: ${GRAPHITI_GROUP_ID:main}
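Because the values above use `${VAR:default}` expansion, every connection setting can be overridden from the environment without editing config.yaml. A sketch pointing the server at a remote, password-protected FalkorDB (host and password are placeholders):

```bash
# Override the FalkorDB connection before starting the server
export FALKORDB_URI="redis://falkordb.internal.example:6379"   # placeholder host
export FALKORDB_PASSWORD="change-me"                           # placeholder password
export FALKORDB_DATABASE="default_db"

uv run main.py
```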
docker/Dockerfile

@@ -1,15 +1,21 @@
-# syntax=docker/dockerfile:1.9
-FROM python:3.12-slim
+# syntax=docker/dockerfile:1
+# Combined FalkorDB + Graphiti MCP Server Image
+# This extends the official FalkorDB image to include the MCP server

-WORKDIR /app
+FROM falkordb/falkordb:latest AS falkordb-base

-# Install system dependencies
+# Install Python and system dependencies
+# Note: Debian Bookworm (FalkorDB base) ships with Python 3.11
 RUN apt-get update && apt-get install -y --no-install-recommends \
+    python3 \
+    python3-dev \
+    python3-pip \
     curl \
     ca-certificates \
+    procps \
     && rm -rf /var/lib/apt/lists/*

-# Install uv using the installer script
+# Install uv for Python package management
 ADD https://astral.sh/uv/install.sh /uv-installer.sh
 RUN sh /uv-installer.sh && rm /uv-installer.sh
@@ -23,55 +29,91 @@ ENV UV_COMPILE_BYTECODE=1 \
     MCP_SERVER_HOST="0.0.0.0" \
     PYTHONUNBUFFERED=1

-# Create non-root user
-RUN groupadd -r app && useradd -r -d /app -g app app
+# Set up MCP server directory
+WORKDIR /app/mcp

-# Accept graphiti-core version as build argument (defaults to latest compatible version)
+# Accept graphiti-core version as build argument
 ARG GRAPHITI_CORE_VERSION=0.22.0

-# Copy project files for dependency installation (better caching)
+# Copy project files for dependency installation
 COPY pyproject.toml uv.lock ./

 # Remove the local path override for graphiti-core in Docker builds
-# Pin to specific version if GRAPHITI_CORE_VERSION is provided, otherwise use >=0.16.0
 RUN sed -i '/\[tool\.uv\.sources\]/,/graphiti-core/d' pyproject.toml && \
     if [ -n "${GRAPHITI_CORE_VERSION}" ]; then \
-        sed -i "s/graphiti-core\[kuzu,falkordb\]>=0\.16\.0/graphiti-core[kuzu,falkordb]==${GRAPHITI_CORE_VERSION}/" pyproject.toml; \
+        sed -i "s/graphiti-core\[falkordb\]>=0\.16\.0/graphiti-core[falkordb]==${GRAPHITI_CORE_VERSION}/" pyproject.toml; \
     fi

-# Install dependencies with explicit graphiti-core version
+# Install Python dependencies
 RUN --mount=type=cache,target=/root/.cache/uv \
     uv sync --no-dev

-# Store graphiti-core version in a file for runtime access
-RUN echo "${GRAPHITI_CORE_VERSION}" > /app/.graphiti-core-version
+# Store graphiti-core version
+RUN echo "${GRAPHITI_CORE_VERSION}" > /app/mcp/.graphiti-core-version

-# Copy application code and configuration
+# Copy MCP server application code
 COPY main.py ./
 COPY src/ ./src/
 COPY config/ ./config/

-# Set execute permissions on main.py and change ownership to app user
-RUN chmod +x /app/main.py && chown -Rv app:app /app
+# Copy FalkorDB combined config (uses localhost since both services in same container)
+COPY config/config-docker-falkordb-combined.yaml /app/mcp/config/config.yaml
+
+# Create log and data directories
+RUN mkdir -p /var/log/graphiti /var/lib/falkordb/data
+
+# Create startup script that runs both services
+RUN cat > /start-services.sh <<'EOF'
+#!/bin/bash
+set -e
+
+# Start FalkorDB in background using the correct module path
+echo "Starting FalkorDB..."
+redis-server \
+    --loadmodule /var/lib/falkordb/bin/falkordb.so \
+    --protected-mode no \
+    --bind 0.0.0.0 \
+    --port 6379 \
+    --dir /var/lib/falkordb/data \
+    --daemonize yes
+
+# Wait for FalkorDB to be ready
+echo "Waiting for FalkorDB to be ready..."
+until redis-cli -h localhost -p 6379 ping > /dev/null 2>&1; do
+    echo "FalkorDB not ready yet, waiting..."
+    sleep 1
+done
+echo "FalkorDB is ready!"
+
+# Start MCP server in foreground
+echo "Starting MCP server..."
+cd /app/mcp
+exec /root/.local/bin/uv run main.py
+EOF
+
+RUN chmod +x /start-services.sh

 # Add Docker labels with version information
 ARG MCP_SERVER_VERSION=1.0.0rc0
 ARG BUILD_DATE
 ARG VCS_REF
-LABEL org.opencontainers.image.title="Graphiti MCP Server" \
-      org.opencontainers.image.description="MCP server for Graphiti knowledge graph" \
+LABEL org.opencontainers.image.title="FalkorDB + Graphiti MCP Server" \
+      org.opencontainers.image.description="Combined FalkorDB graph database with Graphiti MCP server" \
       org.opencontainers.image.version="${MCP_SERVER_VERSION}" \
       org.opencontainers.image.created="${BUILD_DATE}" \
       org.opencontainers.image.revision="${VCS_REF}" \
       org.opencontainers.image.vendor="Zep AI" \
       org.opencontainers.image.source="https://github.com/zep-ai/graphiti" \
-      graphiti.core.version="$(cat /app/.graphiti-core-version)"
+      graphiti.core.version="${GRAPHITI_CORE_VERSION}"

-# Switch to non-root user
-USER app
+# Expose ports
+EXPOSE 6379 3000 8000

-# Expose port
-EXPOSE 8000
+# Health check - verify FalkorDB is responding
+# MCP server startup is logged and visible in container output
+HEALTHCHECK --interval=10s --timeout=5s --start-period=15s --retries=3 \
+    CMD redis-cli -p 6379 ping > /dev/null || exit 1

-# Command to run the application
-CMD ["uv", "run", "main.py"]
+# Override the FalkorDB entrypoint and use our startup script
+ENTRYPOINT ["/start-services.sh"]
+CMD []
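For a one-off build outside Compose, the build arguments declared above can be passed to `docker build` directly; a sketch, run from the mcp_server directory (the image tag is arbitrary):

```bash
cd graphiti/mcp_server

docker build \
  -f docker/Dockerfile \
  --build-arg GRAPHITI_CORE_VERSION=0.22.0 \
  --build-arg MCP_SERVER_VERSION=1.0.0rc0 \
  --build-arg BUILD_DATE="$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --build-arg VCS_REF="$(git rev-parse --short HEAD)" \
  -t graphiti-falkordb:local \
  .
```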
docker/Dockerfile.falkordb-combined (deleted)

@@ -1,119 +0,0 @@
-# syntax=docker/dockerfile:1
-# Combined FalkorDB + Graphiti MCP Server Image
-# This extends the official FalkorDB image to include the MCP server
-
-FROM falkordb/falkordb:latest AS falkordb-base
-
-# Install Python and system dependencies
-# Note: Debian Bookworm (FalkorDB base) ships with Python 3.11
-RUN apt-get update && apt-get install -y --no-install-recommends \
-    python3 \
-    python3-dev \
-    python3-pip \
-    curl \
-    ca-certificates \
-    procps \
-    && rm -rf /var/lib/apt/lists/*
-
-# Install uv for Python package management
-ADD https://astral.sh/uv/install.sh /uv-installer.sh
-RUN sh /uv-installer.sh && rm /uv-installer.sh
-
-# Add uv to PATH
-ENV PATH="/root/.local/bin:${PATH}"
-
-# Configure uv for optimal Docker usage
-ENV UV_COMPILE_BYTECODE=1 \
-    UV_LINK_MODE=copy \
-    UV_PYTHON_DOWNLOADS=never \
-    MCP_SERVER_HOST="0.0.0.0" \
-    PYTHONUNBUFFERED=1
-
-# Set up MCP server directory
-WORKDIR /app/mcp
-
-# Accept graphiti-core version as build argument
-ARG GRAPHITI_CORE_VERSION=0.22.0
-
-# Copy project files for dependency installation
-COPY pyproject.toml uv.lock ./
-
-# Remove the local path override for graphiti-core in Docker builds
-RUN sed -i '/\[tool\.uv\.sources\]/,/graphiti-core/d' pyproject.toml && \
-    if [ -n "${GRAPHITI_CORE_VERSION}" ]; then \
-        sed -i "s/graphiti-core\[kuzu,falkordb\]>=0\.16\.0/graphiti-core[kuzu,falkordb]==${GRAPHITI_CORE_VERSION}/" pyproject.toml; \
-    fi
-
-# Install Python dependencies
-RUN --mount=type=cache,target=/root/.cache/uv \
-    uv sync --no-dev
-
-# Store graphiti-core version
-RUN echo "${GRAPHITI_CORE_VERSION}" > /app/mcp/.graphiti-core-version
-
-# Copy MCP server application code
-COPY main.py ./
-COPY src/ ./src/
-COPY config/ ./config/
-
-# Copy FalkorDB combined config (uses localhost since both services in same container)
-COPY config/config-docker-falkordb-combined.yaml /app/mcp/config/config.yaml
-
-# Create log and data directories
-RUN mkdir -p /var/log/graphiti /var/lib/falkordb/data
-
-# Create startup script that runs both services
-RUN cat > /start-services.sh <<'EOF'
-#!/bin/bash
-set -e
-
-# Start FalkorDB in background using the correct module path
-echo "Starting FalkorDB..."
-redis-server \
-    --loadmodule /var/lib/falkordb/bin/falkordb.so \
-    --protected-mode no \
-    --bind 0.0.0.0 \
-    --port 6379 \
-    --dir /var/lib/falkordb/data \
-    --daemonize yes
-
-# Wait for FalkorDB to be ready
-echo "Waiting for FalkorDB to be ready..."
-until redis-cli -h localhost -p 6379 ping > /dev/null 2>&1; do
-    echo "FalkorDB not ready yet, waiting..."
-    sleep 1
-done
-echo "FalkorDB is ready!"
-
-# Start MCP server in foreground
-echo "Starting MCP server..."
-cd /app/mcp
-exec /root/.local/bin/uv run main.py
-EOF
-
-RUN chmod +x /start-services.sh
-
-# Add Docker labels with version information
-ARG MCP_SERVER_VERSION=1.0.0rc0
-ARG BUILD_DATE
-ARG VCS_REF
-LABEL org.opencontainers.image.title="FalkorDB + Graphiti MCP Server" \
-      org.opencontainers.image.description="Combined FalkorDB graph database with Graphiti MCP server" \
-      org.opencontainers.image.version="${MCP_SERVER_VERSION}" \
-      org.opencontainers.image.created="${BUILD_DATE}" \
-      org.opencontainers.image.revision="${VCS_REF}" \
-      org.opencontainers.image.vendor="Zep AI" \
-      org.opencontainers.image.source="https://github.com/zep-ai/graphiti" \
-      graphiti.core.version="${GRAPHITI_CORE_VERSION}"
-
-# Expose ports
-EXPOSE 6379 3000 8000
-
-# Health check - verify FalkorDB is responding
-# MCP server startup is logged and visible in container output
-HEALTHCHECK --interval=10s --timeout=5s --start-period=15s --retries=3 \
-    CMD redis-cli -p 6379 ping > /dev/null || exit 1
-
-# Override the FalkorDB entrypoint and use our startup script
-ENTRYPOINT ["/start-services.sh"]
-CMD []
docker/docker-compose-falkordb-combined.yml (deleted)

@@ -1,43 +0,0 @@
-services:
-  graphiti-falkordb:
-    image: zepai/graphiti-falkordb:latest
-    build:
-      context: ..
-      dockerfile: docker/Dockerfile.falkordb-combined
-      args:
-        GRAPHITI_CORE_VERSION: ${GRAPHITI_CORE_VERSION:-0.22.0}
-        MCP_SERVER_VERSION: ${MCP_SERVER_VERSION:-1.0.0rc0}
-        BUILD_DATE: ${BUILD_DATE:-}
-        VCS_REF: ${VCS_REF:-}
-    env_file:
-      - path: ../.env
-        required: false
-    environment:
-      # FalkorDB configuration
-      - FALKORDB_PASSWORD=${FALKORDB_PASSWORD:-}
-      # MCP Server configuration
-      - FALKORDB_URI=redis://localhost:6379
-      - FALKORDB_DATABASE=${FALKORDB_DATABASE:-default_db}
-      - GRAPHITI_GROUP_ID=${GRAPHITI_GROUP_ID:-main}
-      - SEMAPHORE_LIMIT=${SEMAPHORE_LIMIT:-10}
-      - CONFIG_PATH=/app/mcp/config/config.yaml
-      - PATH=/root/.local/bin:${PATH}
-    volumes:
-      - falkordb_data:/var/lib/falkordb/data
-      - mcp_logs:/var/log/graphiti
-    ports:
-      - "6379:6379"  # FalkorDB/Redis
-      - "3000:3000"  # FalkorDB web UI
-      - "8000:8000"  # MCP server HTTP
-    healthcheck:
-      test: ["CMD", "redis-cli", "-p", "6379", "ping"]
-      interval: 10s
-      timeout: 5s
-      retries: 5
-      start_period: 15s
-
-volumes:
-  falkordb_data:
-    driver: local
-  mcp_logs:
-    driver: local
docker/docker-compose.yml

@@ -1,30 +1,43 @@
 services:
-  graphiti-mcp:
-    image: zepai/knowledge-graph-mcp:latest
+  graphiti-falkordb:
+    image: zepai/graphiti-falkordb:latest
     build:
       context: ..
       dockerfile: docker/Dockerfile
       args:
         GRAPHITI_CORE_VERSION: ${GRAPHITI_CORE_VERSION:-0.22.0}
         MCP_SERVER_VERSION: ${MCP_SERVER_VERSION:-1.0.0rc0}
         BUILD_DATE: ${BUILD_DATE:-}
         VCS_REF: ${VCS_REF:-}
     env_file:
       - path: ../.env
         required: false
     environment:
-      # Database configuration for KuzuDB - using persistent storage
-      - KUZU_DB=/data/graphiti.kuzu
-      - KUZU_MAX_CONCURRENT_QUERIES=10
-      # Application configuration
+      # FalkorDB configuration
+      - FALKORDB_PASSWORD=${FALKORDB_PASSWORD:-}
+      # MCP Server configuration
+      - FALKORDB_URI=redis://localhost:6379
+      - FALKORDB_DATABASE=${FALKORDB_DATABASE:-default_db}
       - GRAPHITI_GROUP_ID=${GRAPHITI_GROUP_ID:-main}
       - SEMAPHORE_LIMIT=${SEMAPHORE_LIMIT:-10}
-      - CONFIG_PATH=/app/config/config.yaml
+      - CONFIG_PATH=/app/mcp/config/config.yaml
+      - PATH=/root/.local/bin:${PATH}
     volumes:
-      - ../config/config-docker-kuzu.yaml:/app/config/config.yaml:ro
-      # Persistent KuzuDB data storage
-      - kuzu_data:/data
+      - falkordb_data:/var/lib/falkordb/data
+      - mcp_logs:/var/log/graphiti
     ports:
-      - "8000:8000"  # Expose the MCP server via HTTP transport
-    command: ["uv", "run", "main.py"]
+      - "6379:6379"  # FalkorDB/Redis
+      - "3000:3000"  # FalkorDB web UI
+      - "8000:8000"  # MCP server HTTP
+    healthcheck:
+      test: ["CMD", "redis-cli", "-p", "6379", "ping"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+      start_period: 15s

-# Volume for persistent KuzuDB storage
 volumes:
-  kuzu_data:
-    driver: local
+  falkordb_data:
+    driver: local
+  mcp_logs:
+    driver: local
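The `${VAR:-default}` references above are resolved from the optional `../.env` file, so the usual way to customize the stack is a `.env` next to the Compose project rather than editing the YAML. A sketch with placeholder values (the OPENAI_API_KEY name follows the standard OpenAI convention and is an assumption here):

```bash
# mcp_server/.env — read via env_file; values below are placeholders
cat > .env <<'EOF'
OPENAI_API_KEY=your-openai-key-here
FALKORDB_PASSWORD=
FALKORDB_DATABASE=default_db
GRAPHITI_GROUP_ID=main
SEMAPHORE_LIMIT=10
GRAPHITI_CORE_VERSION=0.22.0
EOF

docker compose up -d
```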
pyproject.toml

@@ -7,7 +7,7 @@ requires-python = ">=3.10,<4"
 dependencies = [
     "mcp>=1.9.4",
     "openai>=1.91.0",
-    "graphiti-core[kuzu,falkordb]>=0.16.0",
+    "graphiti-core[falkordb]>=0.16.0",
     "azure-identity>=1.21.0",
     "pydantic-settings>=2.0.0",
     "pyyaml>=6.0",
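For a local, non-Docker environment the dependency change just means the `falkordb` extra of graphiti-core gets installed; a sketch using uv or plain pip (version floor taken from pyproject.toml):

```bash
# Inside mcp_server: resolve and install the project's dependencies with uv
uv sync

# Or, in any virtual environment, install just the graph client stack
pip install "graphiti-core[falkordb]>=0.16.0"
```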