graphiti/graphiti_core
supmo668 74a422369c feat: Add enhanced configuration system with multi-provider LLM support
This commit introduces a comprehensive configuration system that makes
Graphiti more flexible and easier to configure across different
providers and deployment environments.

## New Features

- **Unified Configuration**: New GraphitiConfig class with Pydantic validation
- **YAML Support**: Load configuration from .graphiti.yaml files
- **Multi-Provider Support**: Easy switching between OpenAI, Azure, Anthropic,
  Gemini, Groq, and LiteLLM
- **LiteLLM Integration**: Unified access to 100+ LLM providers
- **Factory Functions**: Automatic client creation from configuration
- **Full Backward Compatibility**: Existing code continues to work
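
The unified-config idea above can be sketched with stdlib dataclasses. The real `GraphitiConfig` uses Pydantic models and can load from `.graphiti.yaml`; the class and field names below are illustrative assumptions, not the shipped API:

```python
from dataclasses import dataclass, field

# Providers named in this commit; the real enum lives in
# graphiti_core/config/providers.py.
SUPPORTED_PROVIDERS = {"openai", "azure", "anthropic", "gemini", "groq", "litellm"}

@dataclass
class LLMSettings:
    """Hypothetical stand-in for the Pydantic LLM settings model."""
    provider: str = "openai"
    model: str = "gpt-4o-mini"

    def __post_init__(self) -> None:
        # Pydantic would enforce this with a field validator.
        if self.provider not in SUPPORTED_PROVIDERS:
            raise ValueError(f"unknown LLM provider: {self.provider!r}")

@dataclass
class GraphitiConfigSketch:
    """Hypothetical top-level config mirroring GraphitiConfig's shape."""
    llm: LLMSettings = field(default_factory=LLMSettings)

    @classmethod
    def from_dict(cls, raw: dict) -> "GraphitiConfigSketch":
        # A YAML loader (e.g. yaml.safe_load on .graphiti.yaml) would
        # produce a dict of this shape.
        return cls(llm=LLMSettings(**raw.get("llm", {})))

cfg = GraphitiConfigSketch.from_dict(
    {"llm": {"provider": "anthropic", "model": "example-model"}}
)
```

Validating at construction time means a typo'd provider fails fast at load, not on the first LLM call.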

## Configuration System

- graphiti_core/config/settings.py: Pydantic configuration classes
- graphiti_core/config/providers.py: Provider enumerations and defaults
- graphiti_core/config/factory.py: Factory functions for client creation
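
Factory functions of this kind typically map a provider enum to a client class. A minimal registry sketch, assuming hypothetical names (`create_llm_client`, `DummyOpenAIClient`) rather than the actual `factory.py` API:

```python
from enum import Enum
from typing import Callable

class LLMProvider(str, Enum):
    OPENAI = "openai"
    AZURE = "azure"
    ANTHROPIC = "anthropic"
    LITELLM = "litellm"

# provider -> client constructor; the real factory would register the
# concrete classes from graphiti_core/llm_client here.
_REGISTRY: dict[LLMProvider, Callable] = {}

def register(provider: LLMProvider):
    def wrap(cls):
        _REGISTRY[provider] = cls
        return cls
    return wrap

@register(LLMProvider.OPENAI)
class DummyOpenAIClient:
    """Placeholder for a concrete LLM client class."""
    def __init__(self, model: str):
        self.model = model

def create_llm_client(provider: LLMProvider, model: str):
    """Build a client from configuration values."""
    try:
        return _REGISTRY[provider](model)
    except KeyError:
        raise ValueError(f"no client registered for {provider.value}") from None
```

The registry keeps provider dispatch in one place, so adding a provider means registering one class rather than editing every call site.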

## LiteLLM Client

- graphiti_core/llm_client/litellm_client.py: New unified LLM client
- Support for Azure OpenAI, AWS Bedrock, Vertex AI, Ollama, vLLM, and other LiteLLM-supported backends
- Automatic structured output detection
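
"Automatic structured output detection" presumably means choosing between native JSON-schema responses and a prompt-based JSON fallback per model. A hedged stdlib sketch of that decision; the capability set and `build_request` are assumptions, not `litellm_client.py`'s real logic:

```python
import json

# Models assumed (for illustration) to support native JSON-schema output;
# the real client would likely consult litellm's model capability metadata.
NATIVE_STRUCTURED = {"openai/gpt-4o", "azure/gpt-4o", "gemini/gemini-2.5-flash"}

def build_request(model: str, prompt: str, schema: dict) -> dict:
    """Return kwargs for a completion call, picking the structured-output path."""
    if model in NATIVE_STRUCTURED:
        # Native path: ask the provider to constrain output to the schema.
        return {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "response_format": {"type": "json_schema", "json_schema": schema},
        }
    # Fallback path: embed the schema in the prompt and parse JSON from text.
    return {
        "model": model,
        "messages": [{
            "role": "user",
            "content": f"{prompt}\nRespond with JSON matching: {json.dumps(schema)}",
        }],
    }
```

Detecting the capability per model lets one client cover both schema-native providers and plain-text backends like local Ollama models.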

## Documentation

- docs/CONFIGURATION.md: Comprehensive configuration guide
- examples/graphiti_config_example.yaml: Example configurations
- DOMAIN_AGNOSTIC_IMPROVEMENT_PLAN.md: Future improvement roadmap
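
The shipped example file isn't reproduced here, but a `.graphiti.yaml` under this scheme would plausibly look like the following; the keys are illustrative assumptions, see docs/CONFIGURATION.md and examples/graphiti_config_example.yaml for the real schema:

```yaml
# Hypothetical .graphiti.yaml sketch (key names are assumptions)
llm:
  provider: litellm
  model: azure/gpt-4o
embedder:
  provider: openai
  model: text-embedding-3-small
```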

## Tests

- tests/config/test_settings.py: 22 tests for configuration
- tests/config/test_factory.py: 12 tests for factories
- 33/34 tests passing (97%)

## Issues Addressed

- #1004: Azure OpenAI support
- #1006: Azure OpenAI reranker support
- #1007: vLLM/OpenAI-compatible provider stability
- #1074: Ollama embeddings support
- #995: Docker Azure OpenAI support

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-30 23:47:38 -08:00
| Name | Last commit | Date |
|------|-------------|------|
| `config` | feat: Add enhanced configuration system with multi-provider LLM support | 2025-11-30 23:47:38 -08:00 |
| `cross_encoder` | fix: replace deprecated gemini-2.5-flash-lite-preview with gemini-2.5-flash-lite (#1076) | 2025-11-20 16:03:51 -08:00 |
| `driver` | fix: Handle EquivalentSchemaRuleAlreadyExists errors in Neo4j driver | 2025-11-23 19:21:55 -08:00 |
| `embedder` | update summary character limit (#1073) | 2025-11-18 17:16:02 -05:00 |
| `llm_client` | feat: Add enhanced configuration system with multi-provider LLM support | 2025-11-30 23:47:38 -08:00 |
| `migrations` | cleanup (#894) | 2025-09-05 11:30:46 -04:00 |
| `models` | Fix entity edge save (#1013) | 2025-11-08 18:32:51 -08:00 |
| `prompts` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `search` | [Improvement] Add GraphID isolation support for FalkorDB multi-tenant architecture (#835) | 2025-11-03 10:56:53 -05:00 |
| `telemetry` | feat: add telemetry with PostHog and update Docker configurations (#633) | 2025-06-27 12:23:30 -07:00 |
| `utils` | update summary character limit (#1073) | 2025-11-18 17:16:02 -05:00 |
| `__init__.py` | chore: Fix packaging (#38) | 2024-08-25 10:07:50 -07:00 |
| `decorators.py` | [Improvement] Add GraphID isolation support for FalkorDB multi-tenant architecture (#835) | 2025-11-03 10:56:53 -05:00 |
| `edges.py` | add search and graph operations interfaces (#984) | 2025-10-07 13:34:37 -04:00 |
| `errors.py` | Add group ID validation and error handling (#618) | 2025-06-24 09:33:54 -07:00 |
| `graph_queries.py` | Graph quality updates (#922) | 2025-09-23 17:53:39 -04:00 |
| `graphiti.py` | feat: Add enhanced configuration system with multi-provider LLM support | 2025-11-30 23:47:38 -08:00 |
| `graphiti_types.py` | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| `helpers.py` | fix-fulltext-syntax-error (#914) | 2025-09-23 10:52:44 -04:00 |
| `nodes.py` | fix deprecated cypher pattern (#993) | 2025-10-09 16:12:55 -04:00 |
| `py.typed` | Add py.typed file (#105) | 2024-09-11 08:44:06 -04:00 |
| `tracer.py` | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |