graphiti/mcp_server/.env.example
0fism 0fa7865cee feat(mcp-server): add EMBEDDER_BASE_URL support for custom embedding endpoints
- Add base_url parameter to GraphitiEmbedderConfig
- Support EMBEDDER_BASE_URL, EMBEDDER_API_KEY, EMBEDDER_MODEL_NAME env vars
- Update docker-compose.yml to pass embedder environment variables
- Add documentation in README.md and .env.example
- Enables use of Ollama, Voyage, and other OpenAI-compatible embedding services

Partially resolves #912, #517
2025-10-16 01:05:22 +08:00
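
In rough terms, the change lets the embedder resolve its own endpoint from the environment instead of always reusing the OpenAI client settings. A minimal sketch of that wiring, assuming a dataclass-style config with api_key/base_url/model fields (the actual GraphitiEmbedderConfig in mcp_server may differ):

```python
import os
from dataclasses import dataclass


@dataclass
class GraphitiEmbedderConfig:
    """Embedder settings resolved from the environment (sketch; field names are assumptions)."""

    api_key: str | None = None
    base_url: str | None = None
    model: str | None = None

    @classmethod
    def from_env(cls) -> "GraphitiEmbedderConfig":
        return cls(
            # EMBEDDER_API_KEY falls back to OPENAI_API_KEY, matching the .env.example note below
            api_key=os.environ.get("EMBEDDER_API_KEY") or os.environ.get("OPENAI_API_KEY"),
            # EMBEDDER_BASE_URL points at Ollama, Voyage, or another OpenAI-compatible endpoint
            base_url=os.environ.get("EMBEDDER_BASE_URL"),
            # EMBEDDER_MODEL_NAME selects the embedding model served by that endpoint
            model=os.environ.get("EMBEDDER_MODEL_NAME"),
        )
```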

# Graphiti MCP Server Environment Configuration

# Neo4j Database Configuration
# These settings are used to connect to your Neo4j database
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo

# OpenAI API Configuration
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini

# Embedding Service Configuration (Optional)
# Use these to configure a separate embedding service (e.g., Ollama, Voyage, or another OpenAI-compatible service)
# EMBEDDER_API_KEY=your_embedder_api_key_here # Defaults to OPENAI_API_KEY if not set
# EMBEDDER_BASE_URL=http://localhost:11434/v1 # For Ollama or other OpenAI-compatible endpoints
# EMBEDDER_MODEL_NAME=nomic-embed-text # Model name for embedding service
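# Illustrative alternative: a hosted Voyage AI endpoint. The base URL and model name
# below are assumptions for illustration, not defaults shipped with this repo:
# EMBEDDER_API_KEY=your_voyage_api_key_here
# EMBEDDER_BASE_URL=https://api.voyageai.com/v1
# EMBEDDER_MODEL_NAME=voyage-3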

# Optional: Only needed for non-standard OpenAI endpoints
# OPENAI_BASE_URL=https://api.openai.com/v1

# Optional: Group ID for namespacing graph data
# GROUP_ID=my_project

# Optional: Path configuration for Docker
# PATH=/root/.local/bin:${PATH}

# Optional: Memory settings for Neo4j (used in Docker Compose)
# NEO4J_server_memory_heap_initial__size=512m
# NEO4J_server_memory_heap_max__size=1G
# NEO4J_server_memory_pagecache_size=512m

# Azure OpenAI Configuration
# Optional: Only needed for Azure OpenAI endpoints
# AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint_here
# AZURE_OPENAI_API_VERSION=2025-01-01-preview
# AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-mini-deployment
# AZURE_OPENAI_EMBEDDING_API_VERSION=2023-05-15
# AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME=text-embedding-3-large-deployment
# AZURE_OPENAI_USE_MANAGED_IDENTITY=false