Added detailed documentation for SEMAPHORE_LIMIT configuration to help users optimize episode processing concurrency based on their LLM provider's rate limits.
Changes:
1. **graphiti_mcp_server.py**
- Expanded inline comments from 3 lines to 26 lines
- Added provider-specific tuning guidelines (OpenAI, Anthropic, Azure, Ollama)
- Documented symptoms of too-high/too-low settings
- Added monitoring recommendations
2. **README.md**
- Expanded "Concurrency and LLM Provider 429 Rate Limit Errors" section
- Added tier-specific recommendations for each provider
- Explained relationship between episode concurrency and LLM request rates
- Added troubleshooting symptoms and monitoring guidance
- Included example .env configuration
3. **config.yaml**
- Added header comment referencing detailed documentation
- Noted default value and suitable use case
4. **.env.example**
- Added SEMAPHORE_LIMIT with inline tuning guidelines
- Quick reference for all major LLM provider tiers
- Cross-reference to README for full details
Benefits:
- Users can now make informed decisions about concurrency settings
- Reduces likelihood of 429 rate limit errors from misconfiguration
- Helps users maximize throughput within their rate limits
- Provides clear troubleshooting guidance
Addresses PR #1024 review comment about magic number documentation.

Co-Authored-By: Claude <noreply@anthropic.com>
This is a major refactoring of the MCP Server to support multiple providers
through a YAML-based configuration system with factory pattern implementation.
## Key Changes
### Architecture Improvements
- Modular configuration system with YAML-based settings
- Factory pattern for LLM, Embedder, and Database providers
- Support for multiple database backends (Neo4j, FalkorDB, KuzuDB)
- Clean separation of concerns with dedicated service modules
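The factory-pattern wiring for providers can be illustrated with a minimal sketch. The class and function names here are hypothetical, not the actual module's API:

```python
# Minimal sketch of the provider-factory idea (illustrative names only).
from dataclasses import dataclass

@dataclass
class LLMClient:
    provider: str
    model: str

def create_llm_client(config: dict) -> LLMClient:
    """Select an LLM client implementation from a config mapping."""
    factories = {
        "openai": lambda c: LLMClient("openai", c.get("model", "default-openai-model")),
        "anthropic": lambda c: LLMClient("anthropic", c.get("model", "default-anthropic-model")),
    }
    provider = config.get("provider", "openai")
    if provider not in factories:
        raise ValueError(f"Unknown LLM provider: {provider}")
    return factories[provider](config)
```

The same shape repeats for embedder and database factories, so adding a new backend means registering one more entry rather than touching call sites.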
### Provider Support
- **LLM**: OpenAI, Anthropic, Gemini, Groq
- **Embedders**: OpenAI, Voyage, Gemini, Anthropic, Sentence Transformers
- **Databases**: Neo4j, FalkorDB, KuzuDB (new default)
- Azure OpenAI support with AD authentication
### Configuration
- YAML configuration with environment variable expansion
- CLI argument overrides for runtime configuration
- Multiple pre-configured Docker Compose setups
- Proper boolean handling in environment variables
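Environment variable expansion with correct boolean handling might look like this minimal sketch (the regex syntax and helper names are assumptions, not the shipped implementation):

```python
import os
import re

# Matches ${VAR} and ${VAR:-default} in YAML string values.
_ENV_PATTERN = re.compile(r"\$\{(\w+)(?::-([^}]*))?\}")

def expand_env(value: str) -> str:
    """Expand ${VAR} / ${VAR:-default} references using os.environ."""
    def repl(match: re.Match) -> str:
        name, default = match.group(1), match.group(2) or ""
        return os.environ.get(name, default)
    return _ENV_PATTERN.sub(repl, value)

def to_bool(value: str) -> bool:
    """Parse a boolean env value; '', 'false', '0', 'no', 'off' are falsy."""
    return value.strip().lower() in {"1", "true", "yes", "on"}
```

Explicit boolean parsing matters because a naive `bool("false")` in Python is `True`, which is the kind of misbehavior the boolean-expansion fix addresses.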
### Testing & CI
- Comprehensive test suite with unit and integration tests
- GitHub Actions workflows for linting and testing
- Multi-database testing support
### Docker Support
- Updated Docker images with multi-stage builds
- Database-specific docker-compose configurations
- Persistent volume support for all databases
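A database-specific compose file with a persistent volume could look roughly like this (the image tag and service layout are illustrative, not the repository's actual files):

```yaml
# Illustrative sketch of a database-specific compose setup.
services:
  neo4j:
    image: neo4j:5
    environment:
      NEO4J_AUTH: ${NEO4J_AUTH:-neo4j/password}
    volumes:
      - neo4j_data:/data   # survives container restarts

volumes:
  neo4j_data:
```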
### Bug Fixes
- Fixed KuzuDB connectivity checks
- Corrected Docker command paths
- Improved error handling and logging
- Fixed boolean environment variable expansion
Co-authored-by: Claude <noreply@anthropic.com>
* Add Azure OpenAI dependency
* Update README
* Add support for Azure OpenAI endpoints
* fix: formatting
---------
Co-authored-by: paulpaliychuk <pavlo.paliychuk.ca@gmail.com>