ragflow/rag
Viktor Dmitriyev b47dcc9108
Fix issue with keep_alive=-1 for ollama chat model by allowing a user to set an additional configuration option (#9017)
### What problem does this PR solve?

Fix an issue with `keep_alive=-1` for the Ollama chat model by allowing the user
to set an additional configuration option. This is a non-breaking change
because the previous default value, `keep_alive=-1`, is still used when the option is not set.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Performance Improvement
- [x] Other (please describe):
- An additional configuration option has been added to control the behavior of
RAGFlow when working with an Ollama LLM
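The option described above can be sketched as follows. This is a minimal, hypothetical helper (the actual RAGFlow option name and wiring may differ) that reads the Ollama `keep_alive` value from an environment variable, here assumed to be `OLLAMA_KEEP_ALIVE`, and falls back to the previous default of `-1` (which tells Ollama to keep the model loaded indefinitely):

```python
import os


def build_ollama_options(env=None):
    """Build the keep_alive option for an Ollama chat request.

    Defaults to -1 (keep the model loaded indefinitely), preserving
    the previous behavior when no configuration is provided.
    """
    if env is None:
        env = os.environ
    raw = env.get("OLLAMA_KEEP_ALIVE", "-1")
    try:
        # Ollama accepts an integer (seconds; -1 = forever, 0 = unload now)
        keep_alive = int(raw)
    except ValueError:
        # ...or a duration string such as "5m"; pass it through unchanged
        keep_alive = raw
    return {"keep_alive": keep_alive}
```

The returned dict could then be merged into the request payload sent to Ollama's chat endpoint, so existing deployments that set nothing continue to get `keep_alive=-1`.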
2025-07-24 11:20:14 +08:00
| Name | Last commit | Date |
| --- | --- | --- |
| app | Fix: when parse markdown support extract image at local (#8906) | 2025-07-18 17:06:58 +08:00 |
| llm | Fix issue with keep_alive=-1 for ollama chat model by allowing a user to set an additional configuration option (#9017) | 2025-07-24 11:20:14 +08:00 |
| nlp | Fix: no chunks parsed out for Law (#8842) | 2025-07-15 13:01:56 +08:00 |
| prompts | Refa: refactor prompts into markdown-style structure using Jinja2 (#8667) | 2025-07-04 15:59:41 +08:00 |
| res | Perf: ignore concate between rows. (#8507) | 2025-06-26 14:55:37 +08:00 |
| svr | Refa: remove temperature since some LLMs fail to support. (#8981) | 2025-07-23 10:17:04 +08:00 |
| utils | use quote_plus to escape password in opendal's mysql url (#8976) | 2025-07-23 10:17:34 +08:00 |
| __init__.py | Update comments (#4569) | 2025-01-21 20:52:28 +08:00 |
| benchmark.py | Refactor embedding batch_size (#3825) | 2024-12-03 16:22:39 +08:00 |
| prompt_template.py | Refa: refactor prompts into markdown-style structure using Jinja2 (#8667) | 2025-07-04 15:59:41 +08:00 |
| prompts.py | Perf: set timeout of some steps in KG. (#8873) | 2025-07-16 18:06:03 +08:00 |
| raptor.py | Refa: remove temperature since some LLMs fail to support. (#8981) | 2025-07-23 10:17:04 +08:00 |
| settings.py | Feat: make document parsing and embedding batch sizes configurable via environment variables (#8266) | 2025-06-16 13:40:47 +08:00 |