ragflow/rag/llm
Stephen Hu e9cbf4611d
Fix: Error when parsing files using Gemini: **ERROR**: GENERIC_ERROR - Unknown field for GenerationConfig: max_tokens (#9195)
### What problem does this PR solve?
https://github.com/infiniflow/ragflow/issues/9177
The cause is that Gemini's `GenerationConfig` uses a different parameter name, `max_output_tokens`, instead of the OpenAI-style `max_tokens`:

```
        max_output_tokens (int):
            Optional. The maximum number of tokens to include in a
            response candidate.

            Note: The default value varies by model, see the
            ``Model.output_token_limit`` attribute of the ``Model``
            returned from the ``getModel`` function.

            This field is a member of `oneof`_ ``_max_output_tokens``.
```
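The fix amounts to renaming the key before the config reaches Gemini. A minimal sketch of that mapping, assuming a dict-based generation config (the helper name `to_gemini_gen_conf` is hypothetical, not the PR's actual code):

```python
def to_gemini_gen_conf(gen_conf: dict) -> dict:
    """Rename the OpenAI-style 'max_tokens' key to Gemini's
    'max_output_tokens' so GenerationConfig accepts it."""
    conf = dict(gen_conf)  # copy; do not mutate the caller's dict
    if "max_tokens" in conf:
        conf["max_output_tokens"] = conf.pop("max_tokens")
    return conf

# Example: an OpenAI-style config is translated for Gemini
print(to_gemini_gen_conf({"temperature": 0.3, "max_tokens": 512}))
# → {'temperature': 0.3, 'max_output_tokens': 512}
```

Keys other than `max_tokens` pass through unchanged, so the same config dict can be shared across providers.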
### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
2025-08-04 10:06:09 +08:00
| File | Latest commit | Date |
| --- | --- | --- |
| __init__.py | Refa: automatic LLMs registration (#8651) | 2025-07-03 19:05:31 +08:00 |
| chat_model.py | Fix: Error when parsing files using Gemini: **ERROR**: GENERIC_ERROR - Unknown field for GenerationConfig: max_tokens (#9195) | 2025-08-04 10:06:09 +08:00 |
| cv_model.py | Refactor: Introduce Image Close For GeminiCV (#9147) | 2025-08-01 12:38:13 +08:00 |
| embedding_model.py | Feat/support 302ai provider (#8742) | 2025-07-31 14:48:30 +08:00 |
| rerank_model.py | Feat/support 302ai provider (#8742) | 2025-07-31 14:48:30 +08:00 |
| sequence2txt_model.py | Feat/support 302ai provider (#8742) | 2025-07-31 14:48:30 +08:00 |
| tts_model.py | Feat/support 302ai provider (#8742) | 2025-07-31 14:48:30 +08:00 |