diff --git a/cognee-mcp/README.md b/cognee-mcp/README.md
index ba7ba2a4d..a5dee68d8 100644
--- a/cognee-mcp/README.md
+++ b/cognee-mcp/README.md
@@ -1,98 +1,159 @@
-# cognee MCP server
+# cognee-mcp – Run cognee’s memory engine as a Model Context Protocol server
+
+Demo · Learn more · Join Discord · Join r/AIMemory
+

+
+

+
+
+Build memory for agents and query it from any client that speaks MCP – in your terminal or IDE.
+
+
+
+## ✨ Features
+
+- SSE & stdio transports – choose real-time streaming with `--transport sse` or the classic stdio pipe (the default)
+- Integrated logging – every action is written to a rotating log file (see `get_log_file_location()`) and mirrored to the console in dev
+- Local file ingestion – feed `.md` files, source files, Cursor rule-sets, etc. straight from disk
+- Background pipelines – long-running `cognify` & `codify` jobs run off-thread; check their progress with the status tools
+- Developer rules bootstrap – one call indexes `.cursorrules`, `.cursor/rules`, `AGENT.md`, and friends into the `developer_rules` nodeset
+- Prune & reset – wipe memory clean with a single `prune` call when you want to start fresh
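+Most MCP clients launch the server over stdio from a JSON config. As a minimal sketch (for Claude Desktop this lives in `claude_desktop_config.json`; the absolute paths are placeholders you must adjust to your checkout):
+
+```json
+{
+  "mcpServers": {
+    "cognee": {
+      "command": "/path/to/cognee/cognee-mcp/.venv/bin/python",
+      "args": ["/path/to/cognee/cognee-mcp/src/server.py"],
+      "env": {
+        "LLM_API_KEY": "YOUR_OPENAI_API_KEY"
+      }
+    }
+  }
+}
+```
+
+Other clients (Cursor, Claude Code, etc.) use the same `command`/`args`/`env` shape under their own config files.
+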
Please refer to our documentation [here](https://docs.cognee.ai/how-to-guides/deployment/mcp) for further information.
-### Installing Manually
-A MCP server project
-=======
-1. Clone the [cognee](https://github.com/topoteretes/cognee) repo
+## 🚀 Quick Start
-2. Install dependencies
+1. Clone the cognee repo
+ ```
+ git clone https://github.com/topoteretes/cognee.git
+ ```
+2. Navigate to the cognee-mcp subdirectory
+ ```
+ cd cognee/cognee-mcp
+ ```
+3. Install uv if you don't already have it (macOS example below; see the uv docs for other platforms)
+ ```
+ brew install uv
+ ```
+4. Install all dependencies for the cognee MCP server with uv
+ ```
+ uv sync --dev --all-extras --reinstall
+ ```
+5. Activate the virtual environment in the cognee-mcp directory
+ ```
+ source .venv/bin/activate
+ ```
+6. Set your OpenAI API key in `.env` for a quick start with the default cognee configuration
+ ```
+ LLM_API_KEY="YOUR_OPENAI_API_KEY"
+ ```
+7. Run the cognee MCP server with the default stdio transport
+ ```
+ python src/server.py
+ ```
+ or stream responses over SSE
+ ```
+ python src/server.py --transport sse
+ ```
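+
+Beyond the single key in step 6, cognee reads its LLM and embedding settings from the same `.env` file. An illustrative sketch – these variable names follow cognee’s configuration docs, so verify them against your installed version:
+
+```
+LLM_API_KEY="YOUR_OPENAI_API_KEY"
+LLM_MODEL="openai/gpt-4o-mini"
+EMBEDDING_PROVIDER="openai"
+EMBEDDING_MODEL="openai/text-embedding-3-large"
+```
+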
-```
-brew install uv
-```
+You can do more advanced configurations by creating .env file using our