ragflow/rag/llm
Viktor Dmitriyev b47dcc9108
Fix issue with keep_alive=-1 for ollama chat model by allowing a user to set an additional configuration option (#9017)
### What problem does this PR solve?

Fix an issue with `keep_alive=-1` for the ollama chat model by allowing the user to set an additional configuration option. This is a non-breaking change because the previous default value, `keep_alive=-1`, is still used when the option is not set.
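
For illustration, a minimal sketch of the idea behind this change, not RAGFlow's actual implementation: the Ollama chat wrapper reads an optional `keep_alive` setting from its configuration and falls back to the previous default of `-1` (keep the model loaded indefinitely). The class name `OllamaChatSketch` and the shape of the `config` dict are assumptions; only the `ollama` client calls are the library's real API.

```python
# A minimal sketch, assuming hypothetical names (OllamaChatSketch, the
# config dict layout); only the `ollama` client calls are real API.
from ollama import Client


class OllamaChatSketch:
    def __init__(self, base_url: str, model_name: str, config: dict | None = None):
        self.client = Client(host=base_url)
        self.model_name = model_name
        # Non-breaking: when the option is absent, fall back to the
        # previous hard-coded default keep_alive=-1 (never unload).
        self.keep_alive = (config or {}).get("keep_alive", -1)

    def chat(self, messages: list[dict]) -> str:
        response = self.client.chat(
            model=self.model_name,
            messages=messages,
            keep_alive=self.keep_alive,  # e.g. -1, 0, or a duration like "5m"
        )
        return response["message"]["content"]
```

With this shape, a deployment that wants Ollama to unload models after use could set `keep_alive` to `0` or a duration string such as `"5m"` in the model configuration, while existing deployments keep the old behavior.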

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Performance Improvement
- [x] Other (please describe): an additional configuration option has been added to control RAGFlow's behavior when working with the ollama LLM
2025-07-24 11:20:14 +08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | Refa: automatic LLMs registration (#8651) | 2025-07-03 19:05:31 +08:00 |
| chat_model.py | Fix issue with keep_alive=-1 for ollama chat model by allowing a user to set an additional configuration option (#9017) | 2025-07-24 11:20:14 +08:00 |
| cv_model.py | Fix: Wrong_Input_type_for_Gemin (#8783) | 2025-07-11 11:34:04 +08:00 |
| embedding_model.py | Fix: typo Bearer token (#8998) | 2025-07-23 18:10:51 +08:00 |
| rerank_model.py | Fix: Improve float operation when rerank (#8963) | 2025-07-22 10:04:00 +08:00 |
| sequence2txt_model.py | Feat: add model provider DeepInfra (#9003) | 2025-07-23 18:10:35 +08:00 |
| tts_model.py | Fix: typo Bearer token (#8998) | 2025-07-23 18:10:51 +08:00 |