CRITICAL FIX - Database Parameter (graphiti_core):
- Fixed the `execute_query` method in `graphiti_core/driver/neo4j_driver.py`
- The `database_` parameter was incorrectly added to the `params` dict instead of `kwargs`
- It is now correctly passed as a keyword argument to the Neo4j driver
- Impact: all queries now execute in the configured database (not the default `neo4j`)
- Root cause: violated the Neo4j Python driver API contract
Technical Details:

Previous code (BROKEN):

```python
params.setdefault('database_', self._database)  # Wrong - in params dict
result = await self.client.execute_query(cypher_query_, parameters_=params, **kwargs)
```

Fixed code (CORRECT):

```python
kwargs.setdefault('database_', self._database)  # Correct - in kwargs
result = await self.client.execute_query(cypher_query_, parameters_=params, **kwargs)
```
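To make the difference concrete, here is a minimal sketch using a simplified stand-in for the driver's `execute_query` signature (the stub below is illustrative, not the real `neo4j` API): anything placed inside the parameters dict is treated as a Cypher query parameter, so a `database_` key there never reaches the driver's database selection.

```python
def execute_query(query_, parameters_=None, database_=None, **kwargs):
    """Stand-in for the Neo4j driver API: reports which database would be
    targeted and which values would be sent as Cypher query parameters."""
    return {
        "database": database_ or "neo4j",         # driver falls back to default DB
        "query_params": dict(parameters_ or {}),  # sent along with the query text
    }

params = {"name": "Alice"}

# BROKEN pattern: database_ hidden inside the params dict
broken = execute_query("MATCH (n) RETURN n",
                       parameters_={**params, "database_": "mydb"})
assert broken["database"] == "neo4j"          # still the default database!
assert "database_" in broken["query_params"]  # leaked as a query parameter

# FIXED pattern: database_ passed as a keyword argument
fixed = execute_query("MATCH (n) RETURN n", parameters_=params, database_="mydb")
assert fixed["database"] == "mydb"            # queries hit the configured DB
```

The stub mirrors why the one-line fix matters: the driver only honors `database_` when it arrives as a keyword argument.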
FIX - Index Creation Error Handling (MCP server):
- Added graceful handling for the Neo4j `IF NOT EXISTS` bug
- Prevents the MCP server from crashing when indices already exist
- Logs a warning instead of failing initialization
- Handles the `EquivalentSchemaRuleAlreadyExists` error gracefully
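The pattern can be sketched as follows. This is a hedged illustration, not the server's actual code: `Neo4jError` below is a stand-in for the driver's `ClientError` (whose `code` attribute carries strings like `Neo.ClientError.Schema.EquivalentSchemaRuleAlreadyExists`), and `create_indices`/`run_query` are hypothetical names.

```python
import logging

logger = logging.getLogger("graphiti_mcp")

class Neo4jError(Exception):
    """Stand-in for neo4j.exceptions.ClientError (illustrative only)."""
    def __init__(self, code, message):
        super().__init__(message)
        self.code = code

def create_indices(run_query):
    """Create indices, tolerating the Neo4j bug where IF NOT EXISTS can
    still raise when an equivalent schema rule already exists."""
    try:
        run_query("CREATE INDEX entity_uuid IF NOT EXISTS FOR (n:Entity) ON (n.uuid)")
    except Neo4jError as exc:
        if "EquivalentSchemaRuleAlreadyExists" in (exc.code or ""):
            # Index is already in place: warn and continue startup
            logger.warning("Index already exists, continuing: %s", exc)
        else:
            raise  # unrelated schema errors should still fail loudly
```

The key design choice is matching only this one error code, so genuine schema problems still abort initialization.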
Files Modified:
- graphiti_core/driver/neo4j_driver.py (3 lines changed)
- mcp_server/src/graphiti_mcp_server.py (12 lines of error handling added)
- mcp_server/pyproject.toml (version bump to 1.0.5)
Testing:
- Python syntax validation: PASSED
- Ruff formatting: PASSED
- Ruff linting: PASSED
Closes issues with:
- Data being stored in wrong Neo4j database
- MCP server crashing on startup with EquivalentSchemaRuleAlreadyExists
- NEO4J_DATABASE environment variable being ignored
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Graphiti Custom Build Documentation
This directory contains documentation for building and deploying a custom Graphiti MCP server with your local changes.
Quick Navigation
🐳 Docker Build Setup
- Complete guide for automated Docker builds via GitHub Actions
- Builds with YOUR local graphiti-core changes (not PyPI)
- Pushes to Docker Hub (`lvarming/graphiti-mcp`)
- Start here if you want to build custom Docker images
🖥️ LibreChat Integration
- Complete setup guide for Graphiti MCP + LibreChat + Neo4j on Unraid
- Uses your custom Docker image from Docker Hub
- Step-by-step deployment instructions
🔌 OpenAI API Compatibility
OpenAI-Compatible-Endpoints.md
- Analysis of OpenAI-compatible endpoint support
- Explains the `/v1/responses` vs `/v1/chat/completions` issue
- Recommendations for supporting OpenRouter, Together.ai, Ollama, etc.
Quick Start for Custom Builds
1. Setup GitHub → Docker Hub Pipeline
Follow GitHub-DockerHub-Setup.md to:
- Create Docker Hub access token
- Add token to GitHub repository secrets
- Push changes to trigger automatic build
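The steps above boil down to a push-triggered workflow. As a rough sketch only (the actual file is `.github/workflows/build-custom-mcp.yml`; the secret names `DOCKERHUB_USERNAME` and `DOCKERHUB_TOKEN` are assumptions, not confirmed by this repo), such a pipeline typically looks like:

```yaml
# Hypothetical minimal workflow sketch - see the real build-custom-mcp.yml
name: build-custom-mcp
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}  # assumed secret name
          password: ${{ secrets.DOCKERHUB_TOKEN }}     # assumed secret name
      - uses: docker/build-push-action@v6
        with:
          context: .
          file: mcp_server/docker/Dockerfile.custom
          platforms: linux/amd64,linux/arm64
          push: true
          tags: lvarming/graphiti-mcp:latest
```

Once the secrets exist, every push to `main` rebuilds and republishes the image with no manual steps.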
2. Deploy on Unraid
Follow Librechat.setup.md to:
- Configure Neo4j connection
- Deploy the Graphiti MCP container using `lvarming/graphiti-mcp:latest`
- Integrate with LibreChat
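As an orientation aid, a deployment of this shape might be expressed as a compose fragment like the one below. This is a hedged sketch, not the documented setup: the environment variable names follow common Graphiti/Neo4j conventions (and `NEO4J_DATABASE` is the variable the commit notes above say is now honored), while the port and service names are placeholders.

```yaml
# Hypothetical compose fragment - consult Librechat.setup.md for real values
services:
  graphiti-mcp:
    image: lvarming/graphiti-mcp:latest
    environment:
      NEO4J_URI: bolt://neo4j:7687
      NEO4J_USER: neo4j
      NEO4J_PASSWORD: ${NEO4J_PASSWORD}
      NEO4J_DATABASE: graphiti   # honored now that database_ is passed correctly
    ports:
      - "8000:8000"              # placeholder port
```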
What's Different in This Setup?
Standard Graphiti Deployment
```yaml
# Uses the official image (built from the PyPI release)
image: zepai/knowledge-graph-mcp:standalone
```
Your Custom Deployment
```yaml
# Uses YOUR image with YOUR changes
image: lvarming/graphiti-mcp:latest
```
The custom image includes:
- ✅ Your local `graphiti-core` changes
- ✅ Your MCP server modifications
- ✅ Both Neo4j and FalkorDB drivers
- ✅ Built automatically on every push
Files in This Repository
Workflow Files
- `.github/workflows/build-custom-mcp.yml` - GitHub Actions workflow for automated builds
Docker Files
- `mcp_server/docker/Dockerfile.custom` - Custom Dockerfile that uses the local graphiti-core
Documentation
- `DOCS/GitHub-DockerHub-Setup.md` - Docker build setup guide
- `DOCS/Librechat.setup.md` - LibreChat integration guide
- `DOCS/OpenAI-Compatible-Endpoints.md` - API compatibility analysis
- `DOCS/README.md` - This file
Workflow Overview
```mermaid
graph LR
    A[Make Changes] --> B[Git Push]
    B --> C[GitHub Actions]
    C --> D[Build Docker Image]
    D --> E[Push to Docker Hub]
    E --> F[Deploy on Unraid]
    F --> G[Use in LibreChat]
```
1. Make Changes - Modify `graphiti_core/` or `mcp_server/`
2. Git Push - Push to the `main` branch on GitHub
3. GitHub Actions - Automatically triggered
4. Build Image - Using `Dockerfile.custom` with local code
5. Push to Docker Hub - Tagged as `lvarming/graphiti-mcp:latest`
6. Deploy on Unraid - Pull the latest image
7. Use in LibreChat - Configure the MCP server URL
Version Information
Your builds include comprehensive version tracking:
```bash
docker inspect lvarming/graphiti-mcp:latest | jq '.[0].Config.Labels'
```
Returns:
```json
{
  "org.opencontainers.image.title": "Graphiti MCP Server (Custom Build)",
  "org.opencontainers.image.version": "1.0.0",
  "graphiti.core.version": "0.23.0",
  "graphiti.core.source": "local",
  "org.opencontainers.image.revision": "abc1234",
  "org.opencontainers.image.created": "2025-11-08T12:00:00Z"
}
```
Key Benefits
🚀 Automated
- No manual Docker builds
- No need to push images yourself
- Triggered automatically on code changes
🔄 Reproducible
- Every build is traceable to a git commit
- Anyone can see exactly what was built
- Version labels include all metadata
🏗️ Multi-Platform
- Builds for AMD64 and ARM64
- Works on Intel, AMD, and Apple Silicon
- Single command works everywhere
🎯 Clean Workflow
- Professional CI/CD pipeline
- Follows industry best practices
- Easy to maintain and extend
Support
Issues with Docker Builds?
See GitHub-DockerHub-Setup.md - Troubleshooting
Issues with Deployment?
See Librechat.setup.md - Troubleshooting
Issues with API Compatibility?
See OpenAI-Compatible-Endpoints.md
Contributing
If you make improvements to these docs or workflows:
- Update the relevant documentation file
- Test the changes
- Commit and push
- (Optional) Share with the community via PR to upstream
License
This documentation follows the same license as the Graphiti project.