graphiti/graphiti_core/llm_client
alan blount 432ff7577d
feat(gemini): simplify config for Gemini clients (#679)
The cross_encoder for Gemini already supported passing in a custom client.

I replicated the same input pattern for the embedder and llm_client.

The value is that you can use custom API endpoints and other options, for example:

        cross_encoder=GeminiRerankerClient(
            # pass a pre-configured genai.Client to control the endpoint / API version
            client=genai.Client(
                api_key=os.environ.get('GOOGLE_GENAI_API_KEY'),
                http_options=types.HttpOptions(api_version='v1alpha'),
            ),
            config=LLMConfig(
                model='gemini-2.5-flash-lite-preview-06-17',
            ),
        ),
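For reference, a minimal sketch of wiring the same pattern into the llm_client and embedder that this PR enables. The GeminiClient and LLMConfig names come from this directory; GeminiEmbedder, GeminiEmbedderConfig, their import path, the Graphiti constructor arguments, and the model names are assumptions for illustration, not confirmed by this listing:

        import os

        from google import genai
        from google.genai import types

        from graphiti_core import Graphiti
        from graphiti_core.llm_client.config import LLMConfig
        from graphiti_core.llm_client.gemini_client import GeminiClient
        # assumed import path and names for the Gemini embedder
        from graphiti_core.embedder.gemini import GeminiEmbedder, GeminiEmbedderConfig

        # One pre-configured client can point every Gemini component at a
        # custom endpoint or API version.
        client = genai.Client(
            api_key=os.environ.get('GOOGLE_GENAI_API_KEY'),
            http_options=types.HttpOptions(api_version='v1alpha'),
        )

        graphiti = Graphiti(
            'bolt://localhost:7687', 'neo4j', 'password',  # assumed connection args
            llm_client=GeminiClient(
                client=client,
                config=LLMConfig(model='gemini-2.5-flash'),  # placeholder model name
            ),
            embedder=GeminiEmbedder(
                client=client,
                config=GeminiEmbedderConfig(embedding_model='text-embedding-004'),  # placeholder
            ),
        )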
2025-07-05 21:14:55 -07:00
__init__.py Add support for falkordb (#575) 2025-06-13 12:06:57 -04:00
anthropic_client.py Refactor imports (#675) 2025-07-05 08:57:07 -07:00
azure_openai_client.py Azure OpenAI improvements and fixes; Improve Graphiti Azure OpenAI config (#620) 2025-06-25 14:48:12 -04:00
client.py small model fix (#432) 2025-05-02 10:08:25 -04:00
config.py small model fix (#432) 2025-05-02 10:08:25 -04:00
errors.py Anthropic client (#361) 2025-04-16 12:35:07 -07:00
gemini_client.py feat(gemini): simplify config for Gemini clients (#679) 2025-07-05 21:14:55 -07:00
groq_client.py Refactor imports (#675) 2025-07-05 08:57:07 -07:00
openai_base_client.py Azure OpenAI improvements and fixes; Improve Graphiti Azure OpenAI config (#620) 2025-06-25 14:48:12 -04:00
openai_client.py Azure OpenAI improvements and fixes; Improve Graphiti Azure OpenAI config (#620) 2025-06-25 14:48:12 -04:00
openai_generic_client.py small model fix (#432) 2025-05-02 10:08:25 -04:00
utils.py update new names with input_data (#204) 2024-10-29 11:03:31 -04:00