docs: add note on LLM service compatibility in README.md (#359)
Enhance README with an important note on LLM service compatibility. Clarifies that Graphiti works best with LLMs supporting Structured Output, highlighting potential issues with smaller models.
This commit is contained in:
parent c19f9d09d3
commit 31a4bfeeb2
1 changed file with 5 additions and 0 deletions
README.md

@@ -94,6 +94,11 @@ Requirements:
 - Neo4j 5.26 or higher (serves as the embeddings storage backend)
 - OpenAI API key (for LLM inference and embedding)
 
+> [!IMPORTANT]
+> Graphiti works best with LLM services that support Structured Output (such as OpenAI and Gemini).
+> Using other services may result in incorrect output schemas and ingestion failures. This is particularly
+> problematic when using smaller models.
+
 Optional:
 
 - Google Gemini, Anthropic, or Groq API key (for alternative LLM providers)
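For context on the note added above: "Structured Output" refers to LLM APIs that can constrain a reply to a caller-supplied schema, which is what prevents the malformed output schemas mentioned in the note. The sketch below is a minimal, hypothetical illustration of that feature using the OpenAI Python SDK with a Pydantic model; the schema, model name, and prompt are illustrative assumptions and are not part of this commit or of Graphiti's own API.

```python
# Illustrative sketch only: what an LLM service with Structured Output support
# provides. The schema, model name, and prompt below are hypothetical examples.
from openai import OpenAI
from pydantic import BaseModel


class ExtractedEntity(BaseModel):
    # Hypothetical schema the model's reply must conform to.
    name: str
    entity_type: str


client = OpenAI()  # reads OPENAI_API_KEY from the environment

completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Extract the entity from: 'Alice joined Acme Corp.'"}
    ],
    response_format=ExtractedEntity,  # the service enforces this schema on the reply
)

entity = completion.choices[0].message.parsed  # a validated ExtractedEntity instance
print(entity.name, entity.entity_type)
```

Services or models without this capability may return free-form text or loosely structured JSON instead, which is the failure mode the README note warns about.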