Cognee Logo

cognee-mcp - Run cognee's memory engine as a Model Context Protocol server

Demo · Learn more · Join Discord · Join r/AIMemory


Build memory for AI agents and query it from any client that speaks MCP, in your terminal or IDE.

Features

  • SSE & stdio transports: choose real-time streaming with --transport sse or the classic stdio pipe
  • Integrated logging: all actions are written to a rotating log file (see get_log_file_location()) and mirrored to the console in dev
  • Local file ingestion: feed .md files, source files, Cursor rulesets, etc. straight from disk
  • Background pipelines: long-running cognify & codify jobs spawn off-thread; check progress with the status tools
  • Developer rules bootstrap: one call indexes .cursorrules, .cursor/rules, AGENT.md, and friends into the developer_rules nodeset
  • Prune & reset: wipe memory clean with a single prune call when you want to start fresh

Please refer to our documentation here for further information.
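To make the developer-rules bootstrap concrete, here is a minimal sketch of what "indexing rule files" can look like. The file names come from the feature list above; the helper function itself is hypothetical, not cognee's actual implementation.

```python
from pathlib import Path

# File names taken from the feature list above; this helper is a
# hypothetical illustration, not cognee's actual bootstrap code.
RULE_FILE_CANDIDATES = [".cursorrules", ".cursor/rules", "AGENT.md"]

def find_rule_files(repo_root: str) -> list[str]:
    """Return which candidate developer-rule files exist under repo_root."""
    root = Path(repo_root)
    return [name for name in RULE_FILE_CANDIDATES if (root / name).exists()]
```

In cognee the discovered files end up in the developer_rules nodeset, so later searches can retrieve them alongside your other memory.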

🚀 Quick Start

  1. Clone cognee repo
    git clone https://github.com/topoteretes/cognee.git
    
  2. Navigate to cognee-mcp subdirectory
    cd cognee/cognee-mcp
    
  3. Install uv if you don't have it already
    pip install uv
    
  4. Install all dependencies for the cognee MCP server with uv
    uv sync --dev --all-extras --reinstall
    
  5. Activate the virtual environment in the cognee-mcp directory
    source .venv/bin/activate
    
  6. Set up your OpenAI API key in .env for a quick setup with the default cognee configurations
    LLM_API_KEY="YOUR_OPENAI_API_KEY"
    
  7. Run the cognee MCP server with the stdio transport (default)
    python src/server.py
    
    or stream responses over SSE
    python src/server.py --transport sse
    

You can configure more advanced setups by creating a .env file from our template. To use different LLM providers or database configurations, and for more info, check out our documentation.
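As a concrete starting point, a minimal .env might look like the sketch below. LLM_API_KEY comes from the Quick Start above, and the embedding variables are taken from the Cursor run-script example later in this README; any other provider-specific keys belong in the template mentioned above.

```shell
# Minimal .env sketch; variable names are taken from examples elsewhere
# in this README. Adjust values for your provider.
LLM_API_KEY="YOUR_OPENAI_API_KEY"

# Optional: local embeddings instead of the default provider.
EMBEDDING_PROVIDER="fastembed"
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
EMBEDDING_DIMENSIONS=384
EMBEDDING_MAX_TOKENS=256
```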

💻 Basic Usage

The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).

Available Tools

  • cognify: Turns your data into a structured knowledge graph and stores it in memory

  • codify: Analyzes a code repository, builds a code graph, and stores it in memory

  • search: Queries memory; supports the GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS, and INSIGHTS search types

  • prune: Reset cognee for a fresh start

  • cognify_status / codify_status: Track pipeline progress

Remember: use the CODE search type to query your code graph. For huge repos, run codify on modules incrementally and cache the results.
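From code rather than an IDE, a tool call over stdio can be sketched with the official MCP Python SDK. The tool name (search) and argument names (search_query, search_type) are assumptions based on the tool list above; check the server source for the exact schema.

```python
import asyncio

# Search types listed above for the `search` tool.
SEARCH_TYPES = {"GRAPH_COMPLETION", "RAG_COMPLETION", "CODE", "CHUNKS", "INSIGHTS"}

async def query_cognee(text: str, search_type: str = "GRAPH_COMPLETION"):
    """Sketch: spawn the server over stdio and run a single search."""
    if search_type not in SEARCH_TYPES:
        raise ValueError(f"unknown search type: {search_type}")

    # Deferred import so this module can be read without the SDK installed.
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command="python", args=["src/server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names here are assumptions; see the server's tool schema.
            return await session.call_tool(
                "search",
                arguments={"search_query": text, "search_type": search_type},
            )

# Example (requires a configured cognee-mcp checkout and the `mcp` package):
# asyncio.run(query_cognee("How is authentication handled?", search_type="CODE"))
```

The same session can call cognify, codify, prune, and the status tools via session.call_tool with their respective argument dictionaries.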

IDE Example: Cursor

  1. After you run the server as described in the Quick Start, create a run script for cognee. Here is a simple example:

    #!/bin/bash
    export ENV=local
    export TOKENIZERS_PARALLELISM=false
    export EMBEDDING_PROVIDER="fastembed"
    export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
    export EMBEDDING_DIMENSIONS=384
    export EMBEDDING_MAX_TOKENS=256
    export LLM_API_KEY=your-OpenAI-API-key
    uv --directory /{cognee_root_path}/cognee-mcp run cognee
    

    Remember to replace your-OpenAI-API-key and {cognee_root_path} with correct values.

  2. Install Cursor and navigate to Settings → MCP Tools → New MCP Server

  3. Cursor will open mcp.json file in a new tab. Configure your cognee MCP server by copy-pasting the following:

    {
      "mcpServers": {
        "cognee": {
          "command": "sh",
          "args": [
            "/{path-to-your-script}/run-cognee.sh"
          ]
        }
      }
    }
    

    Remember to replace {path-to-your-script} with the path to the script you created in the first step.

That's it! You can refresh the server from the toggle next to your new cognee server. Check the green dot and the available tools to verify your server is running.

Now you can open your Cursor Agent and start using cognee tools from it via prompting.

Development and Debugging

Debugging

To use the debugger, run:

    mcp dev src/server.py

Then open the inspector with a timeout query parameter: http://localhost:5173?timeout=120000

To apply new changes while developing cognee:

  1. Run poetry lock in the cognee root folder
  2. uv sync --dev --all-extras --reinstall
  3. mcp dev src/server.py

Development

To use a local cognee build:

  1. Uncomment the following line in the cognee-mcp pyproject.toml file and set the cognee root path.

    #"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users/<username>/Desktop/cognee"
    

    Remember to replace file:/Users/<username>/Desktop/cognee with your actual cognee root path.

  2. Install dependencies with uv in the mcp folder

    uv sync --reinstall
    

Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.

💫 Contributors

contributors

Star History

Star History Chart