graphiti/graphiti_core/llm_client
Daniel Chalef 55ef6acb16
Add Azure OpenAI example with Neo4j (#1064)
* Add Azure OpenAI example with Neo4j

* Convert Azure OpenAI example to use uv

- Remove requirements.txt (uv uses pyproject.toml)
- Update README to use 'uv sync' and 'uv run'

* Update Azure OpenAI example to use gpt-4.1

- Change default deployment from gpt-4 to gpt-4.1
- Update README recommendations to prioritize gpt-4.1 models

* Remove model recommendations from Azure OpenAI example

Model recommendations quickly become outdated.

* Add default Neo4j credentials to docker-compose

Set sensible defaults (neo4j/password) to prevent the NEO4J_AUTH error
when no .env file is present.
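
A minimal sketch of what those defaults look like from the Python side, assuming the usual `NEO4J_*` environment variable names and the positional `Graphiti(uri, user, password)` constructor (both are assumptions beyond what the commit states):

```python
import os

from graphiti_core import Graphiti

# Fall back to the docker-compose defaults (neo4j/password) when no .env
# file overrides these variables. Variable names and the positional
# Graphiti(uri, user, password) constructor are assumptions for illustration.
neo4j_uri = os.environ.get('NEO4J_URI', 'bolt://localhost:7687')
neo4j_user = os.environ.get('NEO4J_USER', 'neo4j')
neo4j_password = os.environ.get('NEO4J_PASSWORD', 'password')

graphiti = Graphiti(neo4j_uri, neo4j_user, neo4j_password)
```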

* Update Azure OpenAI documentation to use v1 API

- Simplified Azure OpenAI setup using AsyncOpenAI with v1 endpoint
- Updated main README with clearer Quick Start example
- Removed outdated API version configuration
- Updated example deployment to gpt-5-mini
- Added note about v1 API endpoint format
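
A minimal sketch of the v1-endpoint setup described above, assuming `OpenAIClient` accepts a pre-built `AsyncOpenAI` instance via a `client` argument; the resource name and key are placeholders, and `gpt-5-mini` matches the example's default deployment:

```python
from openai import AsyncOpenAI

from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_client import OpenAIClient

# Azure OpenAI's v1 surface is served from the /openai/v1/ path, so the plain
# AsyncOpenAI client can point at it directly -- no api_version parameter.
# Resource name and key are placeholders.
azure_llm = AsyncOpenAI(
    base_url='https://<your-resource>.openai.azure.com/openai/v1/',
    api_key='<your-azure-openai-key>',
)

# Assumes OpenAIClient accepts a pre-built AsyncOpenAI instance via `client`.
llm_client = OpenAIClient(
    config=LLMConfig(model='gpt-5-mini'),
    client=azure_llm,
)
```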

* Update LLMConfig to include both model and small_model

Both parameters are needed for proper LLM configuration.
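
A short sketch of such a config, with placeholder deployment names; the commit confirms both fields exist on `LLMConfig`, and `small_model` is used for the lighter-weight calls where the clients support it:

```python
from graphiti_core.llm_client.config import LLMConfig

# `model` serves the main extraction calls; `small_model` handles the
# lighter-weight requests. Deployment names below are placeholders.
llm_config = LLMConfig(
    api_key='<your-azure-openai-key>',
    model='gpt-5-mini',
    small_model='gpt-5-mini',
)
```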

* Address PR review feedback

- Remove flawed validation check in azure_openai_neo4j.py
- Remove unused azure-identity dependency
- Update docstrings to reflect dual client support (AsyncAzureOpenAI and AsyncOpenAI)
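
A hedged sketch of the dual client support mentioned above, assuming the wrapper class is `AzureOpenAILLMClient` and that its constructor takes the pre-built client under an `azure_client` parameter (the parameter name is illustrative):

```python
from openai import AsyncAzureOpenAI

from graphiti_core.llm_client.azure_openai_client import AzureOpenAILLMClient
from graphiti_core.llm_client.config import LLMConfig

# The classic Azure client still needs an explicit api_version; an AsyncOpenAI
# instance pointed at the v1 endpoint (see the sketch above) can be passed in
# its place. The `azure_client` parameter name is an assumption.
azure_client = AsyncAzureOpenAI(
    azure_endpoint='https://<your-resource>.openai.azure.com',
    api_key='<your-azure-openai-key>',
    api_version='<api-version>',
)

llm = AzureOpenAILLMClient(
    azure_client=azure_client,
    config=LLMConfig(model='gpt-5-mini', small_model='gpt-5-mini'),
)
```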

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-11-14 08:34:35 -08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | Add support for falkordb (#575) | 2025-06-13 12:06:57 -04:00 |
| anthropic_client.py | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| azure_openai_client.py | Add Azure OpenAI example with Neo4j (#1064) | 2025-11-14 08:34:35 -08:00 |
| client.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| config.py | Gpt 5 default (#849) | 2025-08-21 12:10:57 -04:00 |
| errors.py | Anthropic client (#361) | 2025-04-16 12:35:07 -07:00 |
| gemini_client.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| groq_client.py | Refactor imports (#675) | 2025-07-05 08:57:07 -07:00 |
| openai_base_client.py | feat: MCP Server v1.0.0 - Modular architecture with multi-provider support (#1024) | 2025-10-30 22:59:01 -07:00 |
| openai_client.py | feat: MCP Server v1.0.0 - Modular architecture with multi-provider support (#1024) | 2025-10-30 22:59:01 -07:00 |
| openai_generic_client.py | Use OpenAI structured output API for response validation (#1061) | 2025-11-11 06:53:37 -08:00 |
| utils.py | update new names with input_data (#204) | 2024-10-29 11:03:31 -04:00 |