- Set default max_tokens to 16384 (16K) for OpenAIGenericClient to better support local models
- Add documentation note clarifying OpenAIGenericClient should be used for Ollama and LM Studio (see the example below)
- Previous default was 8192 (8K)
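A minimal sketch of wiring OpenAIGenericClient to a local OpenAI-compatible server such as Ollama or LM Studio. The import paths, the LLMConfig parameter names, and the localhost endpoint shown here are assumptions based on graphiti_core's llm_client package and may differ between versions.

```python
# Sketch: pointing OpenAIGenericClient at a local OpenAI-compatible server
# (e.g. Ollama or LM Studio). Import paths and parameter names are assumed
# and may differ between graphiti_core versions.
from graphiti_core.llm_client.config import LLMConfig
from graphiti_core.llm_client.openai_generic_client import OpenAIGenericClient

config = LLMConfig(
    api_key="ollama",                      # local servers typically accept any placeholder key
    model="llama3.1",                      # whichever model the local server has loaded
    base_url="http://localhost:11434/v1",  # assumed Ollama OpenAI-compatible endpoint
    max_tokens=16384,                      # explicit override; 16384 is now the default
)

llm_client = OpenAIGenericClient(config=config)
```

Passing max_tokens explicitly also keeps behavior consistent on older releases where the default was still 8192.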
| Name |
|---|
| cross_encoder |
| driver |
| embedder |
| llm_client |
| migrations |
| models |
| prompts |
| search |
| telemetry |
| utils |
| __init__.py |
| decorators.py |
| edges.py |
| errors.py |
| graph_queries.py |
| graphiti.py |
| graphiti_types.py |
| helpers.py |
| nodes.py |
| py.typed |
| tracer.py |