Commit 7ee6cc8eb8 (parent 7743071c51)
Author: David Myriel
Date: 2025-10-30 16:21:41 +01:00
Changed file: README.md
<br />
Cognee - Graph and Vector Memory for AI Agents
<p align="center">
<a href="https://www.youtube.com/watch?v=1bezuvLwJmw&t=2s">Demo</a>
·
<a href="https://docs.cognee.ai/">Docs</a>
·
<a href="https://cognee.ai">Learn More</a>
·
<a href="https://discord.gg/NQPKmU5CCg">Join Discord</a>
·
<a href="https://www.reddit.com/r/AIMemory/">Join r/AIMemory</a>
·
<a href="https://github.com/topoteretes/cognee-community">Integrations</a>
</p>
[![GitHub forks](https://img.shields.io/github/forks/topoteretes/cognee.svg?style=social&label=Fork&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/network/)
[![GitHub stars](https://img.shields.io/github/stars/topoteretes/cognee.svg?style=social&label=Star&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/stargazers/)
[![GitHub commits](https://badgen.net/github/commits/topoteretes/cognee)](https://GitHub.com/topoteretes/cognee/commit/)
[![GitHub tag](https://badgen.net/github/tag/topoteretes/cognee)](https://github.com/topoteretes/cognee/tags/)
[![Downloads](https://static.pepy.tech/badge/cognee)](https://pepy.tech/project/cognee)
[![License](https://img.shields.io/github/license/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/blob/main/LICENSE)
[![Contributors](https://img.shields.io/github/contributors/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/graphs/contributors)
</p>
Persistent and accurate memory for AI agents. With Cognee, your AI agent understands, reasons, and adapts.
<p align="center">
🌐 Available Languages
<!-- Keep these links. Translations will automatically update with the README. -->
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=de">Deutsch</a> |
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=es">Español</a> |
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=fr">Français</a> |
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=ja">日本語</a> |
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=ko">한국어</a> |
<a href="https://www.readme-i18n.com/topoteretes/cognee?lang=pt">Português</a> |
## Quickstart
- 🚀 Try it now on [Google Colab](https://colab.research.google.com/drive/12Vi9zID-M3fpKpKiaqDBvkk98ElkRPWy?usp=sharing)
- 📓 Explore our [Deepnote Notebook](https://deepnote.com/workspace/cognee-382213d0-0444-4c89-8265-13770e333c02/project/cognee-demo-78ffacb9-5832-4611-bb1a-560386068b30/notebook/Notebook-1-75b24cda566d4c24ab348f7150792601?utm_source=share-modal&utm_medium=product-shared-content&utm_campaign=notebook&utm_content=78ffacb9-5832-4611-bb1a-560386068b30)
- 🛠️ Clone our [Starter Repo](https://github.com/topoteretes/cognee/tree/main/cognee-starter-kit)
## About Cognee
Cognee transforms your data into a living knowledge graph that learns from feedback and auto-tunes to deliver better answers over time.
**Run anywhere:**
- 🏠 **Self-Hosted**: Runs locally, data stays on your device
- ☁️ **Cognee Cloud**: Same open-source Cognee, deployed on Modal for seamless workflows
**Self-Hosted Package:**
- Unified memory for all your data sources
- Domain-smart copilots that learn and adapt over time
- Flexible memory architecture for AI agents and devices
- Integrates easily with your current technology stack
- Pythonic data pipelines supporting 30+ data sources out of the box
- Fully extensible: customize tasks, pipelines, and search endpoints
**Cognee Cloud:**
- Get a managed UI and [Hosted Infrastructure](https://www.cognee.ai) with zero setup
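The ingest → link → query shape behind this can be pictured with a deliberately tiny, hypothetical sketch. This is **not** the Cognee API: it replaces the real graph and vector stores with a naive keyword index, purely to illustrate how adding documents, building links, and searching fit together:

```python
from collections import defaultdict

class ToyMemory:
    """Illustrative only: a keyword index standing in for graph + vector memory."""

    def __init__(self):
        self.documents = []
        self.index = defaultdict(set)  # token -> ids of documents mentioning it

    def add(self, text: str) -> None:
        """Ingest raw text."""
        self.documents.append(text)

    def cognify(self) -> None:
        """Link tokens to the documents that mention them."""
        self.index.clear()
        for doc_id, text in enumerate(self.documents):
            for token in text.lower().split():
                self.index[token.strip(".,?!")].add(doc_id)

    def search(self, query: str) -> list[str]:
        """Return documents that share tokens with the query."""
        hits = set()
        for token in query.lower().split():
            hits |= self.index.get(token.strip(".,?!"), set())
        return [self.documents[i] for i in sorted(hits)]

memory = ToyMemory()
memory.add("Cognee turns documents into AI memory.")
memory.cognify()
print(memory.search("What does cognee do?"))
# → ['Cognee turns documents into AI memory.']
```

Cognee's actual pipeline replaces this keyword index with entity extraction, a knowledge graph, and vector embeddings, which is what lets answers improve as the graph grows.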
## Self-Hosted (Open Source)
Run Cognee on your stack. Cognee integrates easily with your current technologies. See our [integration guides](https://docs.cognee.ai/setup-configuration/overview).
### 📦 Installation
Install Cognee with **pip**, **poetry**, **uv**, or your preferred Python package manager.
**Requirements:** Python 3.10 to 3.12
#### Using uv
```bash
uv pip install cognee
```
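The package is published on PyPI as `cognee`, so the equivalents for the other managers mentioned above are:

```shell
# Equivalent installs with other Python package managers
pip install cognee
poetry add cognee
```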
For detailed setup instructions, see our [Documentation](https://docs.cognee.ai/getting-started/installation#environment-configuration).
### 💻 Usage
#### Configuration
```python
import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
Alternatively, create a `.env` file using our [template](https://github.com/topoteretes/cognee/blob/main/.env.template).
To integrate other LLM providers, see our [LLM Provider Documentation](https://docs.cognee.ai/setup-configuration/llm-providers).
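If you want to see what loading a `.env` file involves, here is a minimal sketch using nothing but the standard library. It is purely illustrative (projects typically use a library such as `python-dotenv`), and it handles only simple `KEY=VALUE` lines and `#` comments:

```python
import os

def load_env_file(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments, no quoting rules."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Never overwrite variables already set in the real environment.
            os.environ.setdefault(key.strip(), value.strip())

# Load a local .env if one exists in the working directory.
if os.path.exists(".env"):
    load_env_file()
```

After this runs, `LLM_API_KEY` (and anything else in the file) is visible via `os.environ`, exactly as if it had been exported in the shell.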
#### Python Example
Run the default pipeline with this script:
```python
import cognee
import asyncio

async def main():
    # Add text to cognee's memory
    await cognee.add("Cognee turns documents into AI memory.")
    # Generate the knowledge graph
    await cognee.cognify()
    # Query the knowledge graph
    results = await cognee.search("What does cognee do?")
    for result in results:
        print(result)

asyncio.run(main())
```

Example output:

```
Cognee turns documents into AI memory.
```
#### CLI Example
Get started with these essential commands:
```
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
cognee-cli search "What does cognee do?"
cognee-cli delete --all
```
Or run:
```
cognee-cli -ui
```
## Cognee Cloud
Cognee Cloud is the fastest way to start building reliable AI agent memory. Deploy in minutes with automatic updates, analytics, and enterprise-grade security.
- Sign up on [Cognee Cloud](https://www.cognee.ai)
- Add your API key to the local UI and sync your data to Cognee Cloud
- Start building with managed infrastructure and zero configuration
## Trusted in Production
From regulated industries to startup stacks, Cognee is deployed in production and delivering value now. Read our [case studies](https://cognee.ai/blog) to learn more.
## Demos & Examples
See Cognee in action:
### Cognee Cloud Beta Demo
[Watch Demo](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
### Simple GraphRAG Demo
[Watch Demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
### Cognee with Ollama
[Watch Demo](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)
## Community & Support
### Contributing
We welcome contributions from the community! Your input helps make Cognee better for everyone. See [`CONTRIBUTING.md`](CONTRIBUTING.md) to get started.
### Code of Conduct
We're committed to fostering an inclusive and respectful community. Read our [Code of Conduct](https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md) for guidelines.
## Research & Citation
Cite our research paper on optimizing knowledge graphs for LLM reasoning:
```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,