### What problem does this PR solve?

Fixes an issue with `keep_alive=-1` for the Ollama chat model by allowing the user to set an additional configuration option. This is a non-breaking change because it retains the previous default value (`keep_alive=-1`).

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Performance Improvement
- [x] Other (please describe): an additional configuration option has been added to control RAGFlow's behavior when working with an Ollama LLM
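As a minimal sketch of the intended behavior (hypothetical function and option names, not RAGFlow's actual code): the new option overrides `keep_alive` when present, and the old default of `-1` (keep the model loaded indefinitely) applies when it is absent, which is what makes the change non-breaking.

```python
def resolve_keep_alive(options: dict):
    # Hypothetical helper: "keep_alive" accepts an int (seconds, -1 = forever)
    # or a duration string such as "5m", mirroring Ollama's keep_alive values.
    # When the user sets nothing, fall back to the previous default of -1.
    return options.get("keep_alive", -1)

# Default is preserved when the option is absent (non-breaking):
print(resolve_keep_alive({}))                    # -1
# An explicit user setting takes precedence:
print(resolve_keep_alive({"keep_alive": "5m"}))  # 5m
```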