- Add validation for Azure OpenAI API URL to ensure it's not None
- Fix OpenAI embedder config to use 'embedding_model' instead of 'model'
- Remove 'dimensions' parameter from OpenAIEmbedderConfig (not supported)
- Add URL validation for both LLM and embedder Azure configurations (see the sketch below)
All pyright type checks now pass with 0 errors.
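A minimal sketch of the validation idea, assuming the endpoints come from environment variables; the helper and variable names here are illustrative, not the server's actual code:

```python
import os


def require_azure_endpoint(endpoint: str | None) -> str:
    """Fail fast when the Azure OpenAI API URL is missing instead of passing None along."""
    if not endpoint:
        raise ValueError('Azure OpenAI API URL must be set when the Azure provider is selected')
    return endpoint


if __name__ == '__main__':
    # The same check is applied to both the LLM and the embedder configuration.
    llm_url = require_azure_endpoint(os.environ.get('AZURE_OPENAI_ENDPOINT'))
    embedder_url = require_azure_endpoint(os.environ.get('AZURE_OPENAI_EMBEDDING_ENDPOINT'))
```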
- Add KuzuProviderConfig to schema for KuzuDB configuration
- Update DatabaseDriverFactory to support KuzuDB initialization (sketch below)
- Modify GraphitiService to handle KuzuDB driver instantiation
- Update CLI argument parser to include 'kuzu' option
- Make KuzuDB the default database provider (in-memory by default)
- Add docker-compose-kuzu.yml for standalone KuzuDB deployment
- Fix FalkorDB factory to parse URI into host/port parameters
- Fix EntityNode type access (use labels[0] instead of type)
- Add kuzu dependency to MCP server pyproject.toml
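A rough sketch of the factory change; the graphiti-core module paths and constructor signatures are assumptions:

```python
from urllib.parse import urlparse


def create_driver(provider: str, uri: str | None = None):
    """Pick and initialize a database driver for the configured provider."""
    if provider == 'kuzu':
        # Assumed module path; KuzuDB runs in-memory when no database path is given.
        from graphiti_core.driver.kuzu_driver import KuzuDriver

        return KuzuDriver()
    if provider == 'falkordb':
        # Assumed module path; FalkorDB wants host/port, so parse them out of the URI.
        from graphiti_core.driver.falkordb_driver import FalkorDriver

        parsed = urlparse(uri or 'redis://localhost:6379')
        return FalkorDriver(host=parsed.hostname, port=parsed.port)
    raise ValueError(f'Unsupported database provider: {provider}')
```

Kuzu's in-memory fallback when no database path is configured is what makes it workable as the zero-setup default.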
- Created isolated pytest.ini configuration for MCP server
- Added conftest.py to prevent loading parent project fixtures (sketch below)
- Added pytest and pytest-asyncio to dev dependencies
- Enabled pytest in CI workflow with proper environment variables
- Fixed asyncio test configuration
Pytest now runs successfully for MCP server tests without interfering
with the root project test configuration.
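The isolation comes from running pytest with the MCP server's own pytest.ini as the rootdir; a local conftest.py can then seed the environment the tests expect. A minimal sketch with illustrative variable names and values:

```python
# mcp_server/conftest.py (illustrative sketch, not the actual file)
import os

# Seed defaults so the MCP server tests run in isolation; CI overrides these
# with real values through its own environment variables.
os.environ.setdefault('OPENAI_API_KEY', 'test-key')
os.environ.setdefault('NEO4J_URI', 'bolt://localhost:7687')
os.environ.setdefault('NEO4J_USER', 'neo4j')
os.environ.setdefault('NEO4J_PASSWORD', 'test')
```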
- Added norecursedirs and testpaths to pytest.ini to exclude mcp_server
- Also updated pyproject.toml with additional exclusion patterns
- Root project pytest now only discovers tests in the tests/ directory
- Fix async context manager usage in the clear_data function in graph_data_operations.py (see the sketch after this list)
- Correct import paths for Gemini and Voyage embedders in factories.py
- Move LLM provider dependencies to optional-dependencies section
- Add sentence-transformers as optional dependency for local embeddings
- Update README with optional dependency installation instructions
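The clear_data fix boils down to entering the session as an async context manager. A sketch of the corrected shape, assuming the Neo4j async driver API (the real function in graphiti_core may differ in detail):

```python
from neo4j import AsyncDriver


async def clear_data(driver: AsyncDriver) -> None:
    # The session must be entered with `async with`; the synchronous
    # `with` form was the source of the bug.
    async with driver.session() as session:

        async def delete_all(tx):
            await tx.run('MATCH (n) DETACH DELETE n')

        await session.execute_write(delete_all)
```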
Major improvements to the Graphiti MCP server configuration:
Configuration System:
- Add YAML-based configuration with config.yaml
- Support environment variable expansion in YAML (${VAR_NAME} syntax; sketch below)
- Implement hierarchical configuration: CLI > env > YAML > defaults
- Add pydantic-settings for robust configuration management
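A minimal sketch of the ${VAR_NAME} expansion step, illustrative only; the server's real loader is built on pydantic-settings:

```python
import os
import re

import yaml

_ENV_VAR = re.compile(r'\$\{(\w+)\}')


def load_yaml_config(path: str = 'config.yaml') -> dict:
    """Read the YAML file and expand ${VAR_NAME} references from the environment."""
    with open(path) as f:
        raw = f.read()
    expanded = _ENV_VAR.sub(lambda m: os.environ.get(m.group(1), ''), raw)
    return yaml.safe_load(expanded)
```

Values resolved from config.yaml sit below CLI arguments and environment variables in the precedence order, and above built-in defaults.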
Multi-Provider Support:
- Add factory pattern for LLM clients (OpenAI, Anthropic, Gemini, Groq, Azure; sketch below)
- Add factory pattern for embedder clients (OpenAI, Azure, Gemini, Voyage)
- Add factory pattern for database drivers (Neo4j, FalkorDB)
- Graceful handling of unavailable providers
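The factories follow the same basic pattern; the sketch below shows the LLM side with assumed graphiti-core module paths (the embedder and database factories mirror it):

```python
from graphiti_core.llm_client.config import LLMConfig  # assumed import path


def create_llm_client(provider: str, config: LLMConfig):
    """Return an LLM client for the configured provider, degrading gracefully."""
    if provider == 'openai':
        from graphiti_core.llm_client import OpenAIClient

        return OpenAIClient(config=config)
    if provider == 'anthropic':
        try:
            from graphiti_core.llm_client.anthropic_client import AnthropicClient
        except ImportError as err:
            # Graceful handling of unavailable providers: the optional
            # dependency may simply not be installed.
            raise ValueError("Anthropic selected but the 'anthropic' extra is not installed") from err
        return AnthropicClient(config=config)
    raise ValueError(f'Unsupported LLM provider: {provider}')
```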
Code Improvements:
- Refactor main server to use unified configuration system
- Remove obsolete graphiti_service.py with hardcoded Neo4j configs
- Clean up deprecated type hints and fix all lint issues
- Add comprehensive test suite for configuration loading
Documentation:
- Update README with concise configuration instructions
- Add VS Code integration example
- Remove overly verbose separate documentation
Docker Updates:
- Update Dockerfile to include config.yaml
- Enhance docker-compose.yml with provider environment variables
- Support configuration volume mounting
Breaking Changes:
- None; full backward compatibility is maintained
- All existing CLI arguments and environment variables still work
* Update dependencies in pyproject.toml and uv.lock: bump graphiti-core to 0.14.0, add backoff and posthog packages, and update related version specifications.
* Update the Dockerfile so the ownership change (chown) of the app directory runs in verbose mode
* Update the graphiti dependency
* Introduce the SEMAPHORE_LIMIT environment variable in graphiti_mcp_server.py to manage concurrency and mitigate 429 rate limit errors (see the sketch after these notes).
- Document SEMAPHORE_LIMIT usage in README.md for better user guidance.
* Log the concurrency limit in Graphiti initialization for better visibility.
* Document SEMAPHORE_LIMIT environment variable in README.md to clarify its role in managing episode processing concurrency and handling 429 rate limit errors.
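In practice the limit drives a shared asyncio.Semaphore around episode processing; a minimal sketch, where the fallback value of 10 and the function name are illustrative:

```python
import asyncio
import os

SEMAPHORE_LIMIT = int(os.getenv('SEMAPHORE_LIMIT', '10'))  # fallback value is illustrative
episode_semaphore = asyncio.Semaphore(SEMAPHORE_LIMIT)


async def process_episode_with_limit(episode_id: str) -> None:
    async with episode_semaphore:
        # Capping concurrent episode processing keeps request volume under the
        # LLM provider's rate limits and avoids 429 responses.
        ...
```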
* Refactor group_id handling and update dependencies
- Changed the default `group_id` to 'default' instead of generating a UUID (sketch below).
- Updated README to reflect the new default behavior for `--group-id`.
- Reformatted LLMConfig initialization for better readability.
- Bumped versions of several dependencies including `azure-core`, `azure-identity`, `certifi`, `charset-normalizer`, `sse-starlette`, and `typing-inspection`.
- Added `python-multipart` as a new dependency.
This update improves usability and ensures compatibility with the latest library versions.
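The `group_id` change amounts to something like the following; the function name is illustrative:

```python
def resolve_group_id(cli_value: str | None) -> str:
    # New default: the literal 'default' group, rather than a generated uuid4.
    return cli_value if cli_value else 'default'
```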
* Update Graphiti MCP server instructions and refactor method names for clarity
- Revised the welcome message to enhance clarity about Graphiti's functionality.
- Renamed methods for better understanding: `add_episode` to `add_memory`, `search_nodes` to `search_memory_nodes`, `search_facts` to `search_memory_facts`, and updated related docstrings to reflect these changes.
- Updated references to "knowledge graph" to "graph memory" for consistency throughout the codebase.
* Update README for Graphiti MCP server configuration and integration with Claude Desktop
- Changed server name from "graphiti" to "graphiti-memory" in configuration examples for clarity.
- Added instructions for running the Graphiti MCP server using Docker.
- Included detailed steps for integrating Claude Desktop with the Graphiti MCP server, including optional installation of `mcp-remote`.
- Enhanced overall documentation to improve user experience and understanding of the setup process.
* Enhance error handling in GeminiEmbedder and GeminiClient
- Added checks to raise exceptions when no embeddings or response text are returned, improving robustness (sketch below).
- Included type ignore comments for mypy compatibility in embed_content calls.
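A sketch of the added guard, assuming the google-genai SDK; the actual GeminiEmbedder method differs in structure:

```python
from google import genai


async def embed_text(client: genai.Client, model: str, text: str) -> list[float]:
    result = await client.aio.models.embed_content(model=model, contents=[text])
    # New guard: fail loudly instead of handing None to downstream code.
    if not result.embeddings or not result.embeddings[0].values:
        raise ValueError('No embeddings returned from Gemini API')
    return result.embeddings[0].values
```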
* Update graphiti_core/embedder/gemini.py
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
* Update graphiti_core/llm_client/gemini_client.py
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
---------
Co-authored-by: ellipsis-dev[bot] <65095814+ellipsis-dev[bot]@users.noreply.github.com>
Enhance README.md with a quick start guide for clients and updated environment variable instructions. Update uv.lock to reflect package version changes and add new dependencies, notably bumping graphiti-core to 0.10.5 and adding graph-service as a dependency.
* Add the Azure OpenAI dependency
* Update the README
* Add support for Azure OpenAI endpoints (see the sketch below)
* fix: formatting
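Wiring an Azure OpenAI endpoint into the official SDK looks roughly like this; the environment variable names and API version are illustrative:

```python
import os

from openai import AsyncAzureOpenAI

azure_client = AsyncAzureOpenAI(
    azure_endpoint=os.environ['AZURE_OPENAI_ENDPOINT'],
    api_version=os.environ.get('AZURE_OPENAI_API_VERSION', '2024-10-21'),
    api_key=os.environ['AZURE_OPENAI_API_KEY'],
)
```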
---------
Co-authored-by: paulpaliychuk <pavlo.paliychuk.ca@gmail.com>