graphiti/graphiti_core/llm_client
Galleons2029 998e60baf8 feat(openai): add JSON-schema fallback for structured outputs
- Prefer responses.parse for structured parsing; on clear non-support (404/NotFound/AttributeError or an error mentioning "responses"), fall back to chat.completions.create with response_format: {type: "json_schema"} (sketched below)
- Build the JSON Schema from Pydantic v2 (model_json_schema), with a v1 (schema) fallback
- Preserve reasoning-model temperature behavior (gpt-5/o1/o3) in both the primary and fallback paths
- Normalize provider output to a JSON string and wrap it in a minimal response exposing .output_text
- Update imports and apply minor lint fixes
Motivation: Improve compatibility with OpenAI-compatible providers that lack
/v1/responses while keeping the native OpenAI path unchanged.

Notes: No breaking changes; existing tests pass.
2025-11-06 16:11:08 +08:00
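The fallback flow described in the commit above can be illustrated with a minimal sketch. This is not graphiti's actual implementation (that lives in openai_client.py); the helper names build_json_schema and create_structured_response, the exact error handling, and the SimpleNamespace wrapper are all assumptions for illustration.

```python
# Illustrative sketch of the commit's fallback flow; build_json_schema,
# create_structured_response, and the SimpleNamespace wrapper are hypothetical
# names, not graphiti's actual API (see openai_client.py for the real code).
import json
from types import SimpleNamespace

import openai
from pydantic import BaseModel


def build_json_schema(response_model: type[BaseModel]) -> dict:
    """Return a JSON Schema dict, preferring Pydantic v2 with a v1 fallback."""
    if hasattr(response_model, 'model_json_schema'):  # Pydantic v2
        return response_model.model_json_schema()
    return response_model.schema()  # Pydantic v1


async def create_structured_response(
    client: openai.AsyncOpenAI,
    model: str,
    messages: list[dict],
    response_model: type[BaseModel],
    temperature: float | None = None,
):
    # Reasoning models (gpt-5 / o1 / o3) only accept the default temperature,
    # so it is omitted for them in both the primary and the fallback path.
    is_reasoning = model.startswith(('gpt-5', 'o1', 'o3'))
    kwargs = {} if is_reasoning or temperature is None else {'temperature': temperature}

    try:
        # Primary path: native structured parsing via the Responses API.
        return await client.responses.parse(
            model=model,
            input=messages,
            text_format=response_model,
            **kwargs,
        )
    except (openai.NotFoundError, AttributeError) as err:
        # Fall back only on clear non-support of /v1/responses (404, or an
        # AttributeError mentioning "responses"); anything else is re-raised.
        if isinstance(err, AttributeError) and 'responses' not in str(err).lower():
            raise

    # Fallback path: chat.completions with a json_schema response_format.
    completion = await client.chat.completions.create(
        model=model,
        messages=messages,
        response_format={
            'type': 'json_schema',
            'json_schema': {
                'name': response_model.__name__,
                'schema': build_json_schema(response_model),
                'strict': True,
            },
        },
        **kwargs,
    )
    # Normalize the provider output to a JSON string and wrap it in a minimal
    # object exposing .output_text, so callers see the same shape on both paths.
    content = completion.choices[0].message.content or '{}'
    return SimpleNamespace(output_text=json.dumps(json.loads(content)))
```

Under these assumptions, a caller can treat both paths uniformly, e.g. response_model.model_validate(json.loads(response.output_text)), while the native OpenAI Responses path stays unchanged.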
__init__.py Add support for falkordb (#575) 2025-06-13 12:06:57 -04:00
anthropic_client.py Remove JSON indentation from prompts to reduce token usage (#985) 2025-10-06 16:08:43 -07:00
azure_openai_client.py Fix Azure structured completions (#1039) 2025-11-01 18:40:43 -07:00
client.py Add OpenTelemetry distributed tracing support (#982) 2025-10-05 12:26:14 -07:00
config.py Gpt 5 default (#849) 2025-08-21 12:10:57 -04:00
errors.py Anthropic client (#361) 2025-04-16 12:35:07 -07:00
gemini_client.py Add OpenTelemetry distributed tracing support (#982) 2025-10-05 12:26:14 -07:00
groq_client.py Refactor imports (#675) 2025-07-05 08:57:07 -07:00
openai_base_client.py feat: MCP Server v1.0.0 - Modular architecture with multi-provider support (#1024) 2025-10-30 22:59:01 -07:00
openai_client.py feat(openai): add JSON-schema fallback for structured outputs 2025-11-06 16:11:08 +08:00
openai_generic_client.py Remove JSON indentation from prompts to reduce token usage (#985) 2025-10-06 16:08:43 -07:00
utils.py update new names with input_data (#204) 2024-10-29 11:03:31 -04:00