This is a major refactoring of the MCP Server to support multiple providers through a YAML-based configuration system with a factory-pattern implementation.

## Key Changes

### Architecture Improvements

- Modular configuration system with YAML-based settings
- Factory pattern for LLM, Embedder, and Database providers
- Support for multiple database backends (Neo4j, FalkorDB, KuzuDB)
- Clean separation of concerns with dedicated service modules

### Provider Support

- **LLM**: OpenAI, Anthropic, Gemini, Groq
- **Embedders**: OpenAI, Voyage, Gemini, Anthropic, Sentence Transformers
- **Databases**: Neo4j, FalkorDB, KuzuDB (new default)
- Azure OpenAI support with AD authentication

### Configuration

- YAML configuration with environment variable expansion
- CLI argument overrides for runtime configuration
- Multiple pre-configured Docker Compose setups
- Proper boolean handling in environment variables

### Testing & CI

- Comprehensive test suite with unit and integration tests
- GitHub Actions workflows for linting and testing
- Multi-database testing support

### Docker Support

- Updated Docker images with multi-stage builds
- Database-specific docker-compose configurations
- Persistent volume support for all databases

### Bug Fixes

- Fixed KuzuDB connectivity checks
- Corrected Docker command paths
- Improved error handling and logging
- Fixed boolean environment variable expansion

Co-authored-by: Claude <noreply@anthropic.com>
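The YAML configuration with environment-variable expansion and per-component provider selection described above could look roughly like the sketch below. All key names and the file layout here are illustrative assumptions, not the project's actual schema.

```yaml
# config.yaml — illustrative layout only; key names are assumptions.
database:
  provider: kuzu            # one of: neo4j | falkordb | kuzu (new default)

llm:
  provider: openai          # one of: openai | anthropic | gemini | groq
  model: ${MODEL_NAME}      # expanded from the environment at load time
  api_key: ${OPENAI_API_KEY}

embedder:
  provider: openai          # openai | voyage | gemini | anthropic | sentence_transformers
```

CLI argument overrides would then take precedence over these values at runtime.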
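The factory pattern for providers mentioned in the change list can be sketched as a registry of constructor functions keyed by provider name. Everything below (`create_llm`, `LLMClient`, the registry layout) is an illustrative assumption, not the project's actual API.

```python
# Sketch of a provider factory: a registry maps the configured provider
# name to a constructor. Names here are assumptions, not the real API.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class LLMClient:
    """Placeholder client; a real implementation would wrap an SDK."""
    provider: str
    model: str


def _make_openai(cfg: dict) -> LLMClient:
    return LLMClient(provider="openai", model=cfg.get("model", "gpt-4.1-mini"))


def _make_anthropic(cfg: dict) -> LLMClient:
    return LLMClient(provider="anthropic", model=cfg.get("model", "claude-sonnet"))


# Adding a provider means adding one entry here, with no changes to callers.
_REGISTRY: Dict[str, Callable[[dict], LLMClient]] = {
    "openai": _make_openai,
    "anthropic": _make_anthropic,
}


def create_llm(cfg: dict) -> LLMClient:
    """Look up the configured provider and build the matching client."""
    try:
        factory = _REGISTRY[cfg["provider"]]
    except KeyError as exc:
        raise ValueError(f"Unknown LLM provider: {cfg.get('provider')}") from exc
    return factory(cfg)
```

The same registry shape would apply to the Embedder and Database factories, which is what keeps the provider list extensible.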
# Graphiti MCP Server Environment Configuration

# Neo4j Database Configuration
# These settings are used to connect to your Neo4j database
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo

# OpenAI API Configuration
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini

# Optional: Only needed for non-standard OpenAI endpoints
# OPENAI_BASE_URL=https://api.openai.com/v1

# Optional: Group ID for namespacing graph data
# GROUP_ID=my_project

# Optional: Path configuration for Docker
# PATH=/root/.local/bin:${PATH}

# Optional: Memory settings for Neo4j (used in Docker Compose)
# NEO4J_server_memory_heap_initial__size=512m
# NEO4J_server_memory_heap_max__size=1G
# NEO4J_server_memory_pagecache_size=512m

# Azure OpenAI configuration
# Optional: Only needed for Azure OpenAI endpoints
# AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint_here
# AZURE_OPENAI_API_VERSION=2025-01-01-preview
# AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-gpt-4o-mini-deployment
# AZURE_OPENAI_EMBEDDING_API_VERSION=2023-05-15
# AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME=text-embedding-3-large-deployment
# AZURE_OPENAI_USE_MANAGED_IDENTITY=false
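The "proper boolean handling in environment variables" called out in the PR can be illustrated with a strict parser for flags such as `AZURE_OPENAI_USE_MANAGED_IDENTITY`, which avoids the common bug where any non-empty string (including `"false"`) is treated as truthy. The helper name `env_bool` and the accepted token sets are assumptions, not the project's actual implementation.

```python
# Strict boolean parsing for environment variables; rejects ambiguous
# values instead of silently coercing them. Helper name is an assumption.
import os

_TRUE = {"1", "true", "yes", "on"}
_FALSE = {"0", "false", "no", "off", ""}


def env_bool(name: str, default: bool = False) -> bool:
    """Read an env var as a boolean, raising on unrecognized values."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    value = raw.strip().lower()
    if value in _TRUE:
        return True
    if value in _FALSE:
        return False
    raise ValueError(f"{name} must be a boolean-like value, got {raw!r}")
```

With this approach, `AZURE_OPENAI_USE_MANAGED_IDENTITY=false` correctly disables managed identity rather than enabling it via truthy-string coercion.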