- Add a global --temperature command line argument with an environment variable fallback
- Implement temperature priority for the Ollama LLM binding (see the sketch below):
  1. --ollama-llm-temperature (highest)
  2. OLLAMA_LLM_TEMPERATURE environment variable
  3. --temperature command line argument
  4. TEMPERATURE environment variable (lowest)
- Implement the same priority logic for the OpenAI/Azure OpenAI LLM binding
- Ensure command line arguments always override environment variables
- Maintain backward compatibility with existing configurations
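The priority order above can be read as a simple cascade: check the most specific source first and fall back toward the most general. The following is a minimal illustrative sketch of that cascade, not the actual implementation; the `resolve_temperature` helper is hypothetical, while the flag and environment variable names are taken from the description above.

```python
import argparse
import os
from typing import Optional


def resolve_temperature(args: argparse.Namespace) -> Optional[float]:
    """Resolve the LLM temperature for the Ollama binding.

    Priority (highest to lowest), per the commit description:
      1. --ollama-llm-temperature command line argument
      2. OLLAMA_LLM_TEMPERATURE environment variable
      3. --temperature command line argument
      4. TEMPERATURE environment variable
    Returns None when no source is set, leaving the binding's default in place.
    """
    if args.ollama_llm_temperature is not None:
        return args.ollama_llm_temperature
    env_binding = os.getenv("OLLAMA_LLM_TEMPERATURE")
    if env_binding is not None:
        return float(env_binding)
    if args.temperature is not None:
        return args.temperature
    env_global = os.getenv("TEMPERATURE")
    if env_global is not None:
        return float(env_global)
    return None


parser = argparse.ArgumentParser()
# Global temperature; its env fallback is handled in resolve_temperature().
parser.add_argument("--temperature", type=float, default=None)
# Binding-specific override (highest priority).
parser.add_argument("--ollama-llm-temperature", type=float, default=None)

if __name__ == "__main__":
    print(resolve_temperature(parser.parse_args()))
```

The same cascade would apply to the OpenAI/Azure OpenAI binding by swapping in its binding-specific flag and environment variable.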
Directory listing:

- api
- kg
- llm
- tools
- __init__.py
- base.py
- constants.py
- exceptions.py
- lightrag.py
- llm.py
- namespace.py
- operate.py
- prompt.py
- rerank.py
- types.py
- utils.py
- utils_graph.py