
graph-service

The graph service is a FastAPI server implementing the graphiti package.

Container Releases

The FastAPI server container is automatically built and published to Docker Hub when a new graphiti-core version is released to PyPI.

Image: zepai/graphiti

Available tags:

  • latest - Latest stable release
  • 0.22.1 - Specific version (matches graphiti-core version)

Platforms: linux/amd64, linux/arm64

The automated release workflow:

  1. Triggers when graphiti-core PyPI release completes
  2. Waits for PyPI package availability
  3. Builds multi-platform Docker image
  4. Tags with version number and latest
  5. Pushes to Docker Hub

Only stable releases are built automatically (pre-release versions are skipped).
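The stable-release gate above can be sketched in a few lines. This is a minimal, illustrative sketch only — the real logic lives in the repository's GitHub Actions workflow — assuming PEP 440-style version strings, where pre-releases carry an a/b/rc or .dev suffix:

```python
import re

# Hypothetical helper, not part of the actual workflow: a version is treated
# as a pre-release if it ends in a PEP 440-style suffix (a/b/rc) or a dev segment.
PRE_RELEASE = re.compile(r"(a|b|rc|\.dev)\d+$")

def is_stable(version: str) -> bool:
    """Return True if `version` looks like a stable release (e.g. '0.22.1')."""
    return PRE_RELEASE.search(version) is None

def docker_tags(version: str) -> list[str]:
    """Tags pushed for a stable release: the version number and 'latest'."""
    return [version, "latest"] if is_stable(version) else []
```

For example, `docker_tags("0.22.1")` yields both the version tag and `latest`, while a release candidate such as `0.23.0rc1` produces no tags and is skipped.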

Running Instructions

  1. Ensure you have Docker and Docker Compose installed on your system.

  2. Add zepai/graphiti:latest to your service setup.

  3. Pass the following environment variables to the service:

    OPENAI_API_KEY=your_openai_api_key
    NEO4J_USER=your_neo4j_user
    NEO4J_PASSWORD=your_neo4j_password
    NEO4J_PORT=your_neo4j_port
    
  4. This service depends on access to a Neo4j instance. You may wish to add a neo4j image to your service setup as well, or use a cloud-hosted or desktop Neo4j instance if running this locally.

    An example docker compose setup may look like this:

       version: '3.8'

       services:
         graph:
           image: zepai/graphiti:latest
           ports:
             - "8000:8000"
           environment:
             - OPENAI_API_KEY=${OPENAI_API_KEY}
             - NEO4J_URI=bolt://neo4j:${NEO4J_PORT}
             - NEO4J_USER=${NEO4J_USER}
             - NEO4J_PASSWORD=${NEO4J_PASSWORD}

         neo4j:
           image: neo4j:5.22.0
           ports:
             - "7474:7474"  # HTTP
             - "${NEO4J_PORT}:${NEO4J_PORT}"  # Bolt
           volumes:
             - neo4j_data:/data
           environment:
             - NEO4J_AUTH=${NEO4J_USER}/${NEO4J_PASSWORD}

       volumes:
         neo4j_data:

  5. Once the service starts, it will be available at http://localhost:8000 (or the port you have specified in the docker compose file).

  6. The Swagger docs are available at http://localhost:8000/docs, and ReDoc at http://localhost:8000/redoc.

  7. The Neo4j browser is available at http://localhost:7474 (the port depends on the Neo4j instance you are using).
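Tying the steps above together: the compose file derives NEO4J_URI from NEO4J_PORT and exposes the API on port 8000. A minimal sketch of how a client-side script might assemble the same values (the helper itself is illustrative and not part of the graphiti API; variable names match the README):

```python
# Hypothetical helper: reproduce the wiring the docker compose file performs,
# given the environment variables from step 3.
def service_settings(env: dict[str, str]) -> dict[str, str]:
    port = env.get("NEO4J_PORT", "7687")  # 7687 is Neo4j's default Bolt port
    return {
        "neo4j_uri": f"bolt://neo4j:{port}",  # as constructed in the compose file
        "neo4j_user": env["NEO4J_USER"],
        "neo4j_password": env["NEO4J_PASSWORD"],
        "api_base": "http://localhost:8000",
        "swagger_docs": "http://localhost:8000/docs",
        "redoc": "http://localhost:8000/redoc",
    }
```

Passing the same variables shown in step 3 yields the Bolt URI and API endpoints described in steps 5 and 6.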