ragflow/rag/llm
Stephen Hu a1f848bfe0
Fix: max_tokens must be at least 1, got -950, BadRequestError (#10252)
### What problem does this PR solve?
https://github.com/infiniflow/ragflow/issues/10235

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Refactoring
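
The linked error ("max_tokens must be at least 1, got -950") suggests a token budget that went negative after subtracting the prompt's token count from the configured limit. A minimal sketch of the guard such a fix would apply, clamping the remaining budget to the provider's minimum of 1 — the function name and parameters here are hypothetical illustrations, not the actual `chat_model.py` code:

```python
def remaining_completion_tokens(max_tokens: int, prompt_tokens: int) -> int:
    """Hypothetical helper: tokens left for the completion after the
    prompt is accounted for, clamped so the value sent to the LLM
    provider is never below 1 (which would raise BadRequestError)."""
    return max(1, max_tokens - prompt_tokens)
```

For example, a 50-token limit against a 1000-token prompt yields 1 instead of -950, so the request no longer fails validation (though the model may still truncate or reject an over-long prompt for other reasons).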
2025-09-24 10:49:34 +08:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | Feat: add support for Anthropic third-party API (#10173) | 2025-09-19 19:06:14 +08:00 |
| `chat_model.py` | Fix: max_tokens must be at least 1, got -950, BadRequestError (#10252) | 2025-09-24 10:49:34 +08:00 |
| `cv_model.py` | revert gpt5 integration (#10228) | 2025-09-23 16:06:12 +08:00 |
| `embedding_model.py` | Refactor: use the same implementation for total token count from res (#10197) | 2025-09-22 17:17:06 +08:00 |
| `rerank_model.py` | Refactor: use the same implementation for total token count from res (#10197) | 2025-09-22 17:17:06 +08:00 |
| `sequence2txt_model.py` | Feat: add CometAPI to LLMFactory and update related mappings (#10119) | 2025-09-18 09:51:29 +08:00 |
| `tts_model.py` | Feat: add CometAPI to LLMFactory and update related mappings (#10119) | 2025-09-18 09:51:29 +08:00 |