From 7ee6cc8eb8a05bd9ca64b64256c54fdbdd60dc9e Mon Sep 17 00:00:00 2001
From: David Myriel
Demo
.
- Learn more
+ Docs
+ .
+ Learn More
·
Join Discord
·
Join r/AIMemory
.
- Docs
- .
- cognee community repo
+ Integrations
- cognee - Memory for AI Agents in 6 lines of code
+ Cognee - Graph and Vector Memory for AI Agents
🌐 Available Languages
@@ -53,7 +50,7 @@ Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Ext
Deutsch |
Español |
- français |
+ Français |
日本語 |
한국어 |
Português |
@@ -69,67 +66,68 @@ Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Ext
-## Get Started
+## Quickstart
-Get started quickly with a Google Colab notebook , Deepnote notebook or starter repo
+- 🚀 Try it now on [Google Colab](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing)
+- 📓 Explore our [Deepnote Notebook](https://deepnote.com/workspace/cognee-382213d0-0444-4c89-8265-13770e333c02/project/cognee-demo-78ffacb9-5832-4611-bb1a-560386068b30/notebook/Notebook-1-75b24cda566d4c24ab348f7150792601?utm_source=share-modal&utm_medium=product-shared-content&utm_campaign=notebook&utm_content=78ffacb9-5832-4611-bb1a-560386068b30)
+- 🛠️ Clone our [Starter Repo](https://github.com/topoteretes/cognee/tree/main/cognee-starter-kit)
-## About cognee
+## About Cognee
-cognee works locally and stores your data on your device.
-Our hosted solution is just our deployment of OSS cognee on Modal, with the goal of making development and productionization easier.
+Cognee transforms your data into a living knowledge graph that learns from feedback and auto-tunes to deliver better answers over time.
-Self-hosted package:
+**Run anywhere:**
+- 🏠 **Self-Hosted**: Runs locally, data stays on your device
+- ☁️ **Cognee Cloud**: Same open-source Cognee, deployed on Modal for seamless workflows
-- Interconnects any kind of documents: past conversations, files, images, and audio transcriptions
-- Replaces RAG systems with a memory layer based on graphs and vectors
-- Reduces developer effort and cost, while increasing quality and precision
-- Provides Pythonic data pipelines that manage data ingestion from 30+ data sources
-- Is highly customizable with custom tasks, pipelines, and a set of built-in search endpoints
-
-Hosted platform:
-- Includes a managed UI and a [hosted solution](https://www.cognee.ai)
+**Self-Hosted Package:**
+- Unified memory for all your data sources
+- Domain-smart copilots that learn and adapt over time
+- Flexible memory architecture for AI agents and devices
+- Integrates easily with your current technology stack
+- Pythonic data pipelines supporting 30+ data sources out of the box
+- Fully extensible: customize tasks, pipelines, and search endpoints
+**Cognee Cloud:**
+- Get a managed UI and [Hosted Infrastructure](https://www.cognee.ai) with zero setup
## Self-Hosted (Open Source)
+Run Cognee on your own stack; it integrates easily with your current technologies. See our [integration guides](https://docs.cognee.ai/setup-configuration/overview).
+
### 📦 Installation
-You can install Cognee using either **pip**, **poetry**, **uv** or any other python package manager..
+Install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.
-Cognee supports Python 3.10 to 3.12
+**Requirements:** Python 3.10 to 3.12
-#### With uv
+#### Using uv
```bash
uv pip install cognee
```
-Detailed instructions can be found in our [docs](https://docs.cognee.ai/getting-started/installation#environment-configuration)
+For detailed setup instructions, see our [Documentation](https://docs.cognee.ai/getting-started/installation#environment-configuration).
-### 💻 Basic Usage
+### 💻 Usage
-#### Setup
+#### Configuration
-```
+```python
import os
os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
-
```
-You can also set the variables by creating .env file, using our template.
-To use different LLM providers, for more info check out our documentation
+Alternatively, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template).
+To integrate other LLM providers, see our [LLM Provider Documentation](https://docs.cognee.ai/setup-configuration/llm-providers).
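If you keep the key in a `.env` file but want to load it explicitly in your own script (rather than relying on Cognee to pick it up), here is a minimal sketch using the third-party `python-dotenv` package; using that package is an assumption of this sketch, not something this README requires:

```python
# Minimal sketch, assuming python-dotenv is installed (pip install python-dotenv);
# it is one way to load a .env file, not a Cognee requirement.
import os
from dotenv import load_dotenv

load_dotenv()  # copies variables from .env into os.environ
assert os.getenv("LLM_API_KEY"), "LLM_API_KEY was not found in .env"
```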
-#### Simple example
+#### Python Example
-
-
-##### Python
-
-This script will run the default pipeline:
+Run the default pipeline with this script:
```python
import cognee
@@ -163,9 +161,10 @@ Example output:
Cognee turns documents into AI memory.
```
-##### Via CLI
-Let's get the basics covered
+#### CLI Example
+
+Get started with these essential commands:
```
cognee-cli add "Cognee turns documents into AI memory."
@@ -176,51 +175,52 @@ cognee-cli search "What does cognee do?"
cognee-cli delete --all
```
-or run
+Or run:
```
cognee-cli -ui
```
+## Cognee Cloud
-
+Cognee Cloud is the fastest way to start building reliable AI agent memory. Deploy in minutes with automatic updates, analytics, and enterprise-grade security.
+
+- Sign up on [Cognee Cloud](https://www.cognee.ai)
+- Add your API key to the local UI and sync your data to Cognee Cloud
+- Start building with managed infrastructure and zero configuration
+
+## Trusted in Production
+
+From regulated industries to startup stacks, Cognee is deployed in production and delivering value now. Read our [case studies](https://cognee.ai/blog) to learn more.
+
+## Demos & Examples
+
+See Cognee in action:
+
+### Cognee Cloud Beta Demo
+
+[Watch Demo](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
+
+### Simple GraphRAG Demo
+
+[Watch Demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
+
+### Cognee with Ollama
+
+[Watch Demo](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)
-### Hosted Platform
+## Community & Support
-Get up and running in minutes with automatic updates, analytics, and enterprise security.
+### Contributing
+We welcome contributions from the community! Your input helps make Cognee better for everyone. See [`CONTRIBUTING.md`](CONTRIBUTING.md) to get started.
-1. Sign up on [cogwit](https://www.cognee.ai)
-2. Add your API key to local UI and sync your data to Cogwit
+### Code of Conduct
+We're committed to fostering an inclusive and respectful community. Read our [Code of Conduct](https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md) for guidelines.
+## Research & Citation
-
-## Demos
-
-1. Cogwit Beta demo:
-
-[Cogwit Beta](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
-
-2. Simple GraphRAG demo
-
-[Simple GraphRAG demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
-
-3. cognee with Ollama
-
-[cognee with local models](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)
-
-
-## Contributing
-Your contributions are at the core of making this a true open source project. Any contributions you make are **greatly appreciated**. See [`CONTRIBUTING.md`](CONTRIBUTING.md) for more information.
-
-
-## Code of Conduct
-
-We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.
Demo
@@ -18,7 +18,7 @@
·
Join r/AIMemory
.
- Integrations
+ Community Plugins & Add-ons
-
-## Citation
-
-We now have a paper you can cite:
+Cite our research paper on optimizing knowledge graphs for LLM reasoning:
```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,
From 79f5201d6a7a5e9d44a25cc9919e660afce74ba1 Mon Sep 17 00:00:00 2001
From: David Myriel
- Cognee - Graph and Vector Memory for AI Agents
+ Cognee turns your data into memory for AI agents.
🌐 Available Languages
@@ -64,70 +63,65 @@ Persistent and accurate memory for AI agents. With Cognee, your AI agent underst
+## About Cognee
+Cognee is an open source tool and platform that transforms your raw data into intelligent, searchable AI memory for Agents. It combines vector search with graph databases to make your data both searchable by meaning and connected by relationships.
+
+You can use Cognee in two ways:
+
+1. [Self-host Cognee Open Source](https://github.com/topoteretes/cognee), which stores all data locally by default.
+2. [Connect to Cognee Cloud](https://platform.cognee.ai/), and get the same OSS stack on managed infrastructure for easier development and productionization.
+
+Cognee Open Source (self-hosted):
+
+- Interconnects any type of data — including past conversations, files, images, and audio transcriptions
+- Replaces traditional RAG systems with a unified memory layer built on graphs and vectors
+- Reduces developer effort and infrastructure cost while improving quality and precision
+- Provides Pythonic data pipelines for ingestion from 30+ data sources (see the sketch after this list)
+- Offers high customizability through user-defined tasks, modular pipelines, and built-in search endpoints
+
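As a rough illustration of the ingestion surface described above, the sketch below is assembled from the quickstart snippets in this patch series; the plain-text form of `add()` is the only one shown in this README, so the file-path form is an assumption to check against the docs:

```python
# Sketch only: the plain-text add() call mirrors the quickstart; the file-path
# form is an assumption, so verify it against the Cognee docs before relying on it.
import asyncio
import cognee

async def ingest():
    # Text input, as in the quickstart
    await cognee.add("Cognee turns documents into AI memory.")
    # Assumption: add() may also accept local file paths
    # await cognee.add("/path/to/meeting_notes.md")
    await cognee.cognify()  # build the graph and vector memory from the added data

asyncio.run(ingest())
```
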
+Cognee Cloud (managed):
+- Hosted web UI dashboard
+- Automatic version updates
+- Resource usage analytics
+- GDPR compliant, enterprise-grade security
+
+### Basic Usage & Feature Guide
+
+[Check out this short, end-to-end walkthrough](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing) of Cognee's core features in Google Colab.
+
## Quickstart
-- 🚀 Try it now on [Google Colab](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing)
-- 📓 Explore our [Deepnote Notebook](https://deepnote.com/workspace/cognee-382213d0-0444-4c89-8265-13770e333c02/project/cognee-demo-78ffacb9-5832-4611-bb1a-560386068b30/notebook/Notebook-1-75b24cda566d4c24ab348f7150792601?utm_source=share-modal&utm_medium=product-shared-content&utm_campaign=notebook&utm_content=78ffacb9-5832-4611-bb1a-560386068b30)
-- 🛠️ Clone our [Starter Repo](https://github.com/topoteretes/cognee/tree/main/cognee-starter-kit)
+Let’s take a quick look at how Cognee works in just a few lines of code. For detailed setup and configuration, see the [Cognee Docs](https://docs.cognee.ai/getting-started/installation#environment-configuration).
+### Prerequisites
-## About Cognee
+- Python 3.10 to 3.12
-Cognee transforms your data into a living knowledge graph that learns from feedback and auto-tunes to deliver better answers over time.
+### Step 1: Install Cognee
-**Run anywhere:**
-- 🏠 **Self-Hosted**: Runs locally, data stays on your device
-- ☁️ **Cognee Cloud**: Same open-source Cognee, deployed on Modal for seamless workflows
-
-**Self-Hosted Package:**
-
-- Unified memory for all your data sources
-- Domain-smart copilots that learn and adapt over time
-- Flexible memory architecture for AI agents and devices
-- Integrates easily with your current technology stack
-- Pythonic data pipelines supporting 30+ data sources out of the box
-- Fully extensible: customize tasks, pipelines, and search endpoints
-
-**Cognee Cloud:**
-- Get a managed UI and [Hosted Infrastructure](https://www.cognee.ai) with zero setup
-
-## Self-Hosted (Open Source)
-
-Run Cognee on your stack. Cognee integrates easily with your current technologies. See our [integration guides](https://docs.cognee.ai/setup-configuration/overview).
-
-
-### 📦 Installation
-
-Install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.
-
-**Requirements:** Python 3.10 to 3.12
-
-#### Using uv
+You can install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.
```bash
uv pip install cognee
```
-For detailed setup instructions, see our [Documentation](https://docs.cognee.ai/getting-started/installation#environment-configuration).
-
-### 💻 Usage
-
-#### Configuration
-
+### Step 2: Configure the LLM
```python
import os
os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
```
-
Alternatively, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template).
+
To integrate other LLM providers, see our [LLM Provider Documentation](https://docs.cognee.ai/setup-configuration/llm-providers).
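As a hypothetical illustration of switching providers via environment variables: only `LLM_API_KEY` appears in this README, so the other variable names below are assumptions modeled on the linked `.env` template and should be checked against the LLM provider documentation:

```python
# Hypothetical configuration: LLM_API_KEY is confirmed by this README, but
# LLM_PROVIDER and LLM_MODEL are assumed names; check the .env template and docs.
import os

os.environ["LLM_PROVIDER"] = "openai"    # assumed provider selector
os.environ["LLM_MODEL"] = "gpt-4o-mini"  # assumed model variable
os.environ["LLM_API_KEY"] = "YOUR_API_KEY"
```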
+### Step 3: Run the Pipeline
-#### Python Example
+Cognee takes your documents, generates a knowledge graph from them, and then lets you query that graph using the combined relationships.
-Run the default pipeline with this script:
+Let's run a minimal pipeline:
```python
import cognee
@@ -145,7 +139,7 @@ async def main():
await cognee.memify()
# Query the knowledge graph
- results = await cognee.search("What does cognee do?")
+ results = await cognee.search("What does Cognee do?")
# Display the results
for result in results:
@@ -156,42 +150,31 @@ if __name__ == '__main__':
asyncio.run(main())
```
+
Example output:
```
Cognee turns documents into AI memory.
-
```
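Pieced together from the fragments in this diff, a complete version of the minimal pipeline might look like the sketch below; it is assembled from the calls shown elsewhere in this README rather than copied from the repo, so verify it against the docs:

```python
# Sketch assembled from the snippets in this README; verify against the docs.
import asyncio
import cognee

async def main():
    # Add a document to memory
    await cognee.add("Cognee turns documents into AI memory.")
    # Build the knowledge graph and vector index
    await cognee.cognify()
    # Enrich the memory layer
    await cognee.memify()
    # Query the knowledge graph
    results = await cognee.search("What does Cognee do?")
    # Display the results
    for result in results:
        print(result)

if __name__ == "__main__":
    asyncio.run(main())
```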
-#### CLI Example
+### Use the Cognee CLI
-Get started with these essential commands:
+As an alternative, you can get started with these essential commands:
```
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
-cognee-cli search "What does cognee do?"
+cognee-cli search "What does Cognee do?"
cognee-cli delete --all
```
-Or run:
+
+To open the local UI, run:
```
cognee-cli -ui
```
-## Cognee Cloud
-
-Cognee is the fastest way to start building reliable AI agent memory. Deploy in minutes with automatic updates, analytics, and enterprise-grade security.
-
-- Sign up on [Cognee Cloud](https://www.cognee.ai)
-- Add your API key to local UI and sync your data to Cognee Cloud
-- Start building with managed infrastructure and zero configuration
-
-## Trusted in Production
-
-From regulated industries to startup stacks, Cognee is deployed in production and delivering value now. Read our [case studies](https://cognee.ai/blog) to learn more.
-
## Demos & Examples
See Cognee in action:
@@ -220,7 +203,7 @@ We're committed to fostering an inclusive and respectful community. Read our [Co
## Research & Citation
-Cite our research paper on optimizing knowledge graphs for LLM reasoning:
+We recently published a research paper on optimizing knowledge graphs for LLM reasoning:
```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,
From a068f3536be293f575e21a11ec032c79079b8478 Mon Sep 17 00:00:00 2001
From: David Myriel
Demo
@@ -65,7 +65,7 @@ Use your data to build personalized and dynamic memory for AI Agents. Cognee let
## About Cognee
-Cognee is an open source tool and platform that transforms your raw data into persistent and dynamic AI memory for Agents. It combines vector search with graph databases to make your data both searchable by meaning and connected by relationships.
+Cognee is an open source tool and platform that transforms your raw data into persistent and dynamic AI memory for Agents. It combines vector search with graph databases to make your documents both searchable by meaning and connected by relationships.
You can use Cognee in two ways:
From 4e03406cb6a0e43c72dad6eac510121cd7e9aa72 Mon Sep 17 00:00:00 2001
From: David Myriel
- Cognee turns your data into memory for AI agents.
+ Cognee - Accurate & Persistent Memory for AI agents.
- Cognee - Accurate & Persistent Memory for AI agents.
+ Cognee - Accurate and Persistent AI Memory