cognee - Memory for AI Agents in 5 lines of code

Demo · Learn more · Join Discord · Join r/AIMemory · Docs · cognee community repo

[![GitHub forks](https://img.shields.io/github/forks/topoteretes/cognee.svg?style=social&label=Fork&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/network/) [![GitHub stars](https://img.shields.io/github/stars/topoteretes/cognee.svg?style=social&label=Star&maxAge=2592000)](https://GitHub.com/topoteretes/cognee/stargazers/) [![GitHub commits](https://badgen.net/github/commits/topoteretes/cognee)](https://GitHub.com/topoteretes/cognee/commit/) [![Github tag](https://badgen.net/github/tag/topoteretes/cognee)](https://github.com/topoteretes/cognee/tags/) [![Downloads](https://static.pepy.tech/badge/cognee)](https://pepy.tech/project/cognee) [![License](https://img.shields.io/github/license/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/blob/main/LICENSE) [![Contributors](https://img.shields.io/github/contributors/topoteretes/cognee?colorA=00C586&colorB=000000)](https://github.com/topoteretes/cognee/graphs/contributors) Sponsor


Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Extract, Cognify, Load) pipelines.

🌐 Available Languages : Deutsch | Español | français | 日本語 | 한국어 | Português | Русский | 中文

## Why cognee?
## Features

- Interconnect and retrieve your past conversations, documents, images, and audio transcriptions
- Replaces RAG systems and reduces developer effort and cost
- Load data to graph and vector databases using only Pydantic
- Manipulate your data while ingesting from 30+ data sources

## Get Started

Get started quickly with a Google Colab notebook, Deepnote notebook, or starter repo.

## Using cognee

Self-hosted package:

- Get a self-serve UI with embedded Python notebooks
- Add custom tasks and pipelines via the Python SDK
- Get Docker images and MCP servers you can deploy
- Use the distributed cognee SDK to process terabytes of your data
- Use community adapters to connect to Redis, Azure, Falkor, and others

Hosted platform:

- Sync your local data to our [hosted solution](https://www.cognee.ai)
- Get a secure API endpoint
- We manage the UI for you

## Self-Hosted (Open Source)

### 📦 Installation

You can install Cognee using **pip**, **poetry**, **uv**, or any other Python package manager. Cognee supports Python 3.10 to 3.12.

#### With uv

```bash
uv pip install cognee
```

Detailed instructions can be found in our [docs](https://docs.cognee.ai/getting-started/installation#environment-configuration).

### 💻 Basic Usage

#### Setup

```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```

You can also set the variables by creating a .env file, using our template.
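As a minimal sketch, a .env file for the default OpenAI setup might look like the fragment below. Only `LLM_API_KEY` appears in this README; the other keys are illustrative assumptions, so check the template and the environment-configuration docs for the exact variable names your version supports:

```env
# Required: API key for your LLM provider (OpenAI by default)
LLM_API_KEY="YOUR_OPENAI_API_KEY"

# Illustrative only -- provider/model overrides, if your template defines them
# LLM_PROVIDER="openai"
# LLM_MODEL="gpt-4o-mini"
```

Keep the .env file out of version control (e.g. list it in .gitignore), since it holds credentials.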
To use different LLM providers, check out our documentation for more info.

#### Simple example

##### Python

This script will run the default pipeline:

```python
import asyncio

import cognee


async def main():
    # Add text to cognee
    await cognee.add("Cognee turns documents into AI memory.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Add memory algorithms to the graph
    await cognee.memify()

    # Query the knowledge graph
    results = await cognee.search("What does cognee do?")

    # Display the results
    for result in results:
        print(result)


if __name__ == "__main__":
    asyncio.run(main())
```

Example output:

```
Cognee turns documents into AI memory.
```

##### Via CLI

Let's get the basics covered:

```bash
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
cognee-cli search "What does cognee do?"
cognee-cli delete --all
```

or run:

```bash
cognee-cli -ui
```

### Hosted Platform

Get up and running in minutes with automatic updates, analytics, and enterprise security.

1. Sign up on [cogwit](https://www.cognee.ai)
2. Add your API key to the local UI and sync your data to Cogwit

## Demos

1. Cogwit Beta demo: [Cogwit Beta](https://github.com/user-attachments/assets/fa520cd2-2913-4246-a444-902ea5242cb0)
2. Simple GraphRAG demo: [Simple GraphRAG demo](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)
3. cognee with Ollama: [cognee with local models](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)

## Contributing

Your contributions are at the core of making this a true open source project. Any contributions you make are **greatly appreciated**. See [`CONTRIBUTING.md`](CONTRIBUTING.md) for more information.

## Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.
## Citation

We now have a paper you can cite:

```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,
      title={Optimizing the Interface Between Knowledge Graphs and LLMs for Complex Reasoning},
      author={Vasilije Markovic and Lazar Obradovic and Laszlo Hajdu and Jovan Pavlovic},
      year={2025},
      eprint={2505.24478},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2505.24478},
}
```