chore: Update MCP server configuration and documentation

Updates MCP server factories, pyproject.toml, and README.md to improve configuration handling and documentation clarity.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Daniel Chalef, 2025-08-25 16:38:28 -07:00
parent 2802f98e84
commit 40a570c957
4 changed files with 28 additions and 6 deletions


@@ -89,7 +89,7 @@ async def build_indices_and_constraints(driver: GraphDriver, delete_existing: bo
 async def clear_data(driver: GraphDriver, group_ids: list[str] | None = None):
-    async with driver.session() as session:
+    with driver.session() as session:
         async def delete_all(tx):
             await tx.run('MATCH (n) DETACH DELETE n')


@@ -97,6 +97,24 @@ database:
   provider: "neo4j"  # or "falkordb" (requires additional setup)
 ```
+
+### Using Ollama for Local LLM
+
+To use Ollama with the MCP server, configure it as an OpenAI-compatible endpoint:
+
+```yaml
+llm:
+  provider: "openai"
+  model: "llama3.2"  # or your preferred Ollama model
+  api_base: "http://localhost:11434/v1"
+  api_key: "ollama"  # dummy key required
+
+embedder:
+  provider: "sentence_transformers"  # recommended for local setup
+  model: "all-MiniLM-L6-v2"
+```
+
+Make sure Ollama is running locally with: `ollama serve`
+
+### Environment Variables
+
+The `config.yaml` file supports environment variable expansion using `${VAR_NAME}` or `${VAR_NAME:default}` syntax. Key variables:
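As an illustration of the expansion syntax, `${VAR_NAME}` / `${VAR_NAME:default}` substitution can be implemented in a few lines. This is a hypothetical sketch of the behavior, not the MCP server's actual config loader:

```python
import os
import re

# Matches ${VAR_NAME} or ${VAR_NAME:default}; group 1 is the variable name,
# group 2 (optional) is the default value after the colon.
_PATTERN = re.compile(r"\$\{(\w+)(?::([^}]*))?\}")


def expand_env(value: str) -> str:
    """Expand ${VAR} and ${VAR:default} references using os.environ.

    An unset variable with no default expands to the empty string.
    """
    def repl(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        return os.environ.get(name, default if default is not None else "")

    return _PATTERN.sub(repl, value)
```

For example, with `NEO4J_HOST=db.example.com` set, `expand_env("bolt://${NEO4J_HOST}:7687")` would yield `bolt://db.example.com:7687`, while `${NEO4J_HOST:localhost}` falls back to `localhost` when the variable is unset.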


@@ -25,14 +25,14 @@ except ImportError:
     HAS_AZURE_EMBEDDER = False
 try:
-    from graphiti_core.embedder import GeminiEmbedder
+    from graphiti_core.embedder.gemini_embedder import GeminiEmbedder
     HAS_GEMINI_EMBEDDER = True
 except ImportError:
     HAS_GEMINI_EMBEDDER = False
 try:
-    from graphiti_core.embedder import VoyageEmbedder
+    from graphiti_core.embedder.voyage_embedder import VoyageEmbedder
     HAS_VOYAGE_EMBEDDER = True
 except ImportError:
@@ -46,21 +46,21 @@ except ImportError:
     HAS_AZURE_LLM = False
 try:
-    from graphiti_core.llm_client import AnthropicClient
+    from graphiti_core.llm_client.anthropic_client import AnthropicClient
     HAS_ANTHROPIC = True
 except ImportError:
     HAS_ANTHROPIC = False
 try:
-    from graphiti_core.llm_client import GeminiClient
+    from graphiti_core.llm_client.gemini_client import GeminiClient
     HAS_GEMINI = True
 except ImportError:
     HAS_GEMINI = False
 try:
-    from graphiti_core.llm_client import GroqClient
+    from graphiti_core.llm_client.groq_client import GroqClient
     HAS_GROQ = True
 except ImportError:

@@ -11,6 +11,10 @@ dependencies = [
     "azure-identity>=1.21.0",
     "pydantic-settings>=2.0.0",
     "pyyaml>=6.0",
+    "google-genai>=1.8.0",
+    "anthropic>=0.49.0",
+    "groq>=0.2.0",
+    "voyageai>=0.2.3",
 ]
 
 [dependency-groups]