diff --git a/mcp_server/.env.example b/mcp_server/.env.example
index b184f4e1..9decb6c7 100644
--- a/mcp_server/.env.example
+++ b/mcp_server/.env.example
@@ -20,6 +20,9 @@ FALKORDB_PASSWORD=
 OPENAI_API_KEY=your_openai_api_key_here
 MODEL_NAME=gpt-4.1-mini
 
+# Optional: OpenAI model name to use for embedding operations.
+# EMBEDDER_MODEL_NAME=text-embedding-3-small
+
 # Optional: Only needed for non-standard OpenAI endpoints
 # OPENAI_BASE_URL=https://api.openai.com/v1
 
diff --git a/mcp_server/README.md b/mcp_server/README.md
index e43dc1d5..3024a1c4 100644
--- a/mcp_server/README.md
+++ b/mcp_server/README.md
@@ -101,6 +101,7 @@ The server supports both Neo4j and FalkorDB as database backends. Use the `DATAB
 - `OPENAI_BASE_URL`: Optional base URL for OpenAI API
 - `MODEL_NAME`: OpenAI model name to use for LLM operations.
 - `SMALL_MODEL_NAME`: OpenAI model name to use for smaller LLM operations.
+- `EMBEDDER_MODEL_NAME`: OpenAI model name to use for embedding operations.
 - `LLM_TEMPERATURE`: Temperature for LLM responses (0.0-2.0).
 - `AZURE_OPENAI_ENDPOINT`: Optional Azure OpenAI LLM endpoint URL
 - `AZURE_OPENAI_DEPLOYMENT_NAME`: Optional Azure OpenAI LLM deployment name
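For context, the pattern this diff documents is an optional environment-variable override with a commented-out default in `.env.example`. A minimal sketch of how a config loader might consume it is below; `resolve_embedder_model` and the fallback constant are illustrative assumptions, not the server's actual implementation:

```python
import os

# Assumed fallback when EMBEDDER_MODEL_NAME is unset; the commented-out
# line in .env.example suggests this value, but the server's real default
# may differ.
DEFAULT_EMBEDDER_MODEL = "text-embedding-3-small"

def resolve_embedder_model() -> str:
    """Return the embedding model name, preferring the env override."""
    return os.environ.get("EMBEDDER_MODEL_NAME", DEFAULT_EMBEDDER_MODEL)

# With the variable left commented out in .env, the default applies;
# uncommenting it (or exporting it) overrides the model name.
os.environ["EMBEDDER_MODEL_NAME"] = "text-embedding-3-large"
print(resolve_embedder_model())  # prints "text-embedding-3-large"
```

Because the sample line in `.env.example` is commented out, copying the file verbatim leaves the override unset and the server's built-in default in effect.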