### What problem does this PR solve?

Fixes an issue with `keep_alive=-1` for the Ollama chat model by letting the user set an additional configuration option. This is a non-breaking change: the previous default value (`keep_alive=-1`) is still used when the option is not set.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Performance Improvement
- [x] Other (please describe): an additional configuration option has been added to control how RAGFlow manages `keep_alive` when working with an Ollama LLM (see the sketch below).
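For context, here is a minimal sketch of how such an option could be wired through to Ollama. The `OLLAMA_KEEP_ALIVE` environment variable and the default value shown are illustrative assumptions, not the PR's actual setting name; the `keep_alive` parameter itself is part of the `ollama` Python client's `chat` call (`-1` keeps the model loaded indefinitely, `0` unloads it right after the request):

```python
import os

import ollama

# Hypothetical configuration knob; the real option name in the PR may differ.
# Falls back to the previous default (-1) so existing deployments behave the same.
keep_alive = int(os.environ.get("OLLAMA_KEEP_ALIVE", "-1"))

client = ollama.Client(host="http://localhost:11434")

response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Hello"}],
    keep_alive=keep_alive,  # -1: keep model resident; 0: unload immediately
)
print(response["message"]["content"])
```

Exposing the value through configuration rather than hard-coding it lets operators trade GPU memory for latency: a long `keep_alive` avoids reloading the model between requests, while a short one frees memory on shared hardware.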