LightRAG/lightrag/llm

Latest commit: 02fdceb959 by yangdx, 2025-11-21 12:55:44 +08:00
Update OpenAI client to use stable API and bump minimum version to 2.0.0
- Remove beta prefix from completions.parse
- Update OpenAI dependency to >=2.0.0
- Fix whitespace formatting
- Update all requirement files
- Clean up pyproject.toml dependencies
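As a hedged illustration of the first bullet above: with openai>=2.0.0 the structured-output helper is called on the stable path rather than the beta one. The model name and the `StructuredAnswer` schema below are illustrative assumptions, not taken from the repository.

```python
from pydantic import BaseModel
from openai import AsyncOpenAI


class StructuredAnswer(BaseModel):
    answer: str


async def parse_example() -> StructuredAnswer:
    client = AsyncOpenAI()
    # openai < 2.0.0:  await client.beta.chat.completions.parse(...)
    # openai >= 2.0.0: the beta prefix is dropped, as in the commit above
    response = await client.chat.completions.parse(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "Reply with a short answer."}],
        response_format=StructuredAnswer,
    )
    # The SDK returns the validated pydantic object on message.parsed
    return response.choices[0].message.parsed
```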
| File | Last commit | Date |
|------|-------------|------|
| deprecated | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| __init__.py | Separated llms from the main llm.py file and fixed some deprecation bugs | 2025-01-25 00:11:00 +01:00 |
| anthropic.py | Add Deepseek Style Chain of Thought (CoT) Support for OpenAI Compatible LLM providers | 2025-09-09 22:34:36 +08:00 |
| azure_openai.py | Update OpenAI client to use stable API and bump minimum version to 2.0.0 | 2025-11-21 12:55:44 +08:00 |
| bedrock.py | Improve Bedrock error handling with retry logic and custom exceptions | 2025-11-14 18:51:41 +08:00 |
| binding_options.py | Add Gemini embedding support | 2025-11-08 03:34:30 +08:00 |
| gemini.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| hf.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| jina.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| llama_index_impl.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| lmdeploy.py | Add Deepseek Style Chain of Thought (CoT) Support for OpenAI Compatible LLM providers | 2025-09-09 22:34:36 +08:00 |
| lollms.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| nvidia_openai.py | Add max_token_size parameter to embedding function decorators | 2025-11-14 18:41:43 +08:00 |
| ollama.py | Update OpenAI client to use stable API and bump minimum version to 2.0.0 | 2025-11-21 12:55:44 +08:00 |
| zhipu.py | Add Deepseek Style Chain of Thought (CoT) Support for OpenAI Compatible LLM providers | 2025-09-09 22:34:36 +08:00 |
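Each module listed above is a per-provider LLM/embedding binding. A minimal sketch of how such a binding is typically wired into LightRAG follows; the helper names (`gpt_4o_mini_complete`, `openai_embed`), the `initialize_storages()` step, and the working directory are assumptions based on common LightRAG usage, not on this listing.

```python
import asyncio

from lightrag import LightRAG, QueryParam
from lightrag.llm.openai import gpt_4o_mini_complete, openai_embed  # binding from lightrag/llm/openai.py


async def main() -> None:
    rag = LightRAG(
        working_dir="./rag_storage",          # illustrative storage path (assumption)
        llm_model_func=gpt_4o_mini_complete,  # completion helper exported by the OpenAI binding
        embedding_func=openai_embed,          # embedding helper exported by the same module
    )
    await rag.initialize_storages()           # storage setup step used in recent LightRAG versions (assumption)

    await rag.ainsert("LightRAG separates LLM bindings into per-provider modules.")
    answer = await rag.aquery(
        "How are LLM bindings organized?",
        param=QueryParam(mode="hybrid"),
    )
    print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```

Swapping providers is then a matter of importing the corresponding helpers from another module in this directory (for example ollama.py or gemini.py) and passing them to the same constructor parameters.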