graphiti/mcp_server/.env.example
donbr 4ab47c097e feat(mcp_server): Add FastMCP Cloud deployment support
Add tooling and configuration for deploying the Graphiti MCP server to
FastMCP Cloud with Neo4j or FalkorDB backends.

Changes:
- Add DATABASE_PROVIDER env var support in config.yaml for runtime
  database selection (neo4j or falkordb)
- Add EMBEDDING_DIM, EMBEDDER_PROVIDER, EMBEDDER_MODEL env var support
  for embedder configuration
- Add python-dotenv and pydantic to pyproject.toml dependencies
- Bump version to 1.0.2
- Rewrite .env.example with comprehensive documentation for both
  local development and FastMCP Cloud deployment
- Add verification script (scripts/verify_fastmcp_cloud_readiness.py)
  that checks 6 deployment prerequisites
- Add deployment guide (docs/FASTMCP_CLOUD_DEPLOYMENT.md)

The server can now be configured entirely via environment variables,
making it compatible with FastMCP Cloud, which ignores config files.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-06 23:25:31 -08:00

# Graphiti MCP Server Environment Configuration
# =============================================
#
# For LOCAL development: Copy this file to .env and fill in your values
# For FASTMCP CLOUD: Set these in the FastMCP Cloud UI (NOT in .env files)
#
# FastMCP Cloud ignores .env files; you MUST set secrets in the Cloud UI.

# =============================================================================
# DATABASE CONFIGURATION (Choose ONE provider)
# =============================================================================
# Database Provider Selection
# Options: neo4j, falkordb
DATABASE_PROVIDER=neo4j
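# Example of selecting the alternate backend (if you use this, also uncomment
# and fill in the FalkorDB section below):
# DATABASE_PROVIDER=falkordb
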
# --- Neo4j Configuration ---
# For local development: bolt://localhost:7687
# For Neo4j Aura (cloud): neo4j+s://xxxxx.databases.neo4j.io
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=your_neo4j_password_here
NEO4J_DATABASE=neo4j

# --- FalkorDB Configuration ---
# For local development: redis://localhost:6379
# For FalkorDB Cloud: redis://username:password@host:port
# FALKORDB_URI=redis://localhost:6379
# FALKORDB_PASSWORD=
# FALKORDB_DATABASE=default_db
# FALKORDB_USER=

# =============================================================================
# LLM PROVIDER CONFIGURATION (Required)
# =============================================================================
# OpenAI (Default)
OPENAI_API_KEY=your_openai_api_key_here
# Optional: Override default model
# LLM_MODEL=gpt-4.1-mini

# --- Alternative LLM Providers ---
# Anthropic
# ANTHROPIC_API_KEY=sk-ant-...
# Google Gemini
# GOOGLE_API_KEY=...
# GOOGLE_PROJECT_ID=
# GOOGLE_LOCATION=us-central1
# Groq
# GROQ_API_KEY=...
# Azure OpenAI
# AZURE_OPENAI_API_KEY=...
# AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
# AZURE_OPENAI_DEPLOYMENT=your-deployment-name
# AZURE_OPENAI_API_VERSION=2024-10-21
# USE_AZURE_AD=false

# =============================================================================
# EMBEDDER CONFIGURATION (Optional - defaults to OpenAI)
# =============================================================================
# Voyage AI (recommended by Anthropic for Claude integrations)
# VOYAGE_API_KEY=...
# Note: Voyage AI uses 1024 dimensions by default
# Embedding dimensions (must match your embedding model)
# OpenAI text-embedding-3-small: 1536
# Voyage AI voyage-3: 1024
# EMBEDDING_DIM=1536
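#
# Sketch of explicit embedder selection via the EMBEDDER_PROVIDER and
# EMBEDDER_MODEL variables added in this release; the values shown are
# assumptions, so confirm the accepted provider names in config.yaml:
# EMBEDDER_PROVIDER=openai
# EMBEDDER_MODEL=text-embedding-3-small
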
# =============================================================================
# GRAPHITI CONFIGURATION
# =============================================================================
# Group ID for namespacing graph data
GRAPHITI_GROUP_ID=main
# User ID for tracking operations
USER_ID=mcp_user
# Episode ID prefix (optional)
# EPISODE_ID_PREFIX=

# =============================================================================
# PERFORMANCE TUNING
# =============================================================================
# Concurrency Control
# Controls how many episodes can be processed simultaneously
# Default: 10 (suitable for OpenAI Tier 3, mid-tier Anthropic)
#
# Adjust based on your LLM provider's rate limits:
# - OpenAI Tier 1 (free): 1-2
# - OpenAI Tier 2: 5-8
# - OpenAI Tier 3: 10-15
# - OpenAI Tier 4: 20-50
# - Anthropic default: 5-8
# - Anthropic high tier: 15-30
#
# See README.md "Concurrency and LLM Provider 429 Rate Limit Errors" for details
SEMAPHORE_LIMIT=10
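# Example of a more conservative setting for a low rate-limit account
# (e.g. OpenAI Tier 1, per the guidance above):
# SEMAPHORE_LIMIT=2
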
# =============================================================================
# FASTMCP CLOUD DEPLOYMENT CHECKLIST
# =============================================================================
# When deploying to FastMCP Cloud, set these in the Cloud UI:
#
# REQUIRED:
# - OPENAI_API_KEY (or your chosen LLM provider key)
# - Database credentials (Neo4j or FalkorDB); see the example at the end of this file:
#   - Neo4j Aura: NEO4J_URI, NEO4J_USER, NEO4J_PASSWORD
#   - FalkorDB Cloud: FALKORDB_URI, FALKORDB_USER, FALKORDB_PASSWORD
#
# OPTIONAL:
# - DATABASE_PROVIDER (defaults to falkordb)
# - SEMAPHORE_LIMIT (defaults to 10)
# - GRAPHITI_GROUP_ID (defaults to main)
#
# IMPORTANT: FastMCP Cloud IGNORES:
# - .env files
# - config.yaml files
# - if __name__ == "__main__" blocks
# =============================================================================
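#
# Example: illustrative Cloud UI values for a Neo4j Aura + OpenAI deployment
# (placeholders only; substitute your real endpoint and secrets):
#
#   DATABASE_PROVIDER=neo4j
#   NEO4J_URI=neo4j+s://xxxxx.databases.neo4j.io
#   NEO4J_USER=neo4j
#   NEO4J_PASSWORD=your_neo4j_password_here
#   OPENAI_API_KEY=your_openai_api_key_here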