Cognee - Accurate and Persistent AI Memory
Demo · Docs · Learn More · Join Discord · Join r/AIMemory · Community Plugins & Add-ons
Use your data to build personalized and dynamic memory for AI Agents. Cognee lets you replace RAG with scalable and modular ECL (Extract, Cognify, Load) pipelines.
🌐 Available Languages : Deutsch | Español | Français | 日本語 | 한국어 | Português | Русский | 中文
About Cognee
Cognee is an open-source tool and platform that transforms your raw data into persistent and dynamic AI memory for Agents. It combines vector search with graph databases to make your documents both searchable by meaning and connected by relationships.
You can use Cognee in two ways:
- Self-host Cognee Open Source, which stores all data locally by default.
- Connect to Cognee Cloud and get the same OSS stack on managed infrastructure, for easier development and a smoother path to production.
Cognee Open Source (self-hosted):
- Interconnects any type of data — including past conversations, files, images, and audio transcriptions
- Replaces traditional RAG systems with a unified memory layer built on graphs and vectors
- Reduces developer effort and infrastructure cost while improving quality and precision
- Provides Pythonic data pipelines for ingestion from 30+ data sources
- Offers high customizability through user-defined tasks, modular pipelines, and built-in search endpoints
Cognee Cloud (managed):
- Hosted web UI dashboard
- Automatic version updates
- Resource usage analytics
- GDPR compliant, enterprise-grade security
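The Extract, Cognify, Load (ECL) flow behind these pipelines can be sketched abstractly. This is a hypothetical illustration in plain Python — none of these names (`MemoryStore`, `extract`, `cognify`, `load`) come from cognee's actual API; see the quickstart below for the real calls.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an ECL (Extract, Cognify, Load) pipeline.
# These names are illustrative only, not cognee's real API.

@dataclass
class MemoryStore:
    nodes: list = field(default_factory=list)
    edges: list = field(default_factory=list)

def extract(raw_documents):
    """Extract: normalize raw inputs into non-empty text chunks."""
    return [doc.strip() for doc in raw_documents if doc.strip()]

def cognify(chunks):
    """Cognify: derive graph nodes and relationships from chunks."""
    nodes = [{"id": i, "text": chunk} for i, chunk in enumerate(chunks)]
    # Naive placeholder relationship: link consecutive chunks.
    edges = [(i, i + 1) for i in range(len(nodes) - 1)]
    return nodes, edges

def load(store, nodes, edges):
    """Load: persist the derived graph into the memory store."""
    store.nodes.extend(nodes)
    store.edges.extend(edges)
    return store

store = load(MemoryStore(), *cognify(extract(["Cognee turns documents into AI memory.", "  "])))
print(len(store.nodes), len(store.edges))  # → 1 0
```

The point of the sketch is the separation of stages: each stage is a swappable task, which is what makes the pipelines modular and customizable.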
Basic Usage & Feature Guide
To learn more, check out this short, end-to-end Colab walkthrough of Cognee's core features.
Quickstart
Let’s try Cognee in just a few lines of code. For detailed setup and configuration, see the Cognee Docs.
Prerequisites
- Python 3.10 to 3.13
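If you want to sanity-check your interpreter against that range before installing, a stdlib-only check might look like this (the helper name is ours, not part of cognee):

```python
import sys

# Supported range from the prerequisites above: Python 3.10 through 3.13, inclusive.
MIN_VERSION, MAX_VERSION = (3, 10), (3, 13)

def python_supported(version=None):
    """Return True if the (major, minor) pair falls in the supported range."""
    version = version or sys.version_info[:2]
    return MIN_VERSION <= version <= MAX_VERSION

if __name__ == "__main__":
    major, minor = sys.version_info[:2]
    print(f"Python {major}.{minor} supported: {python_supported()}")
```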
Step 1: Install Cognee
You can install Cognee with pip, poetry, uv, or your preferred Python package manager.
uv pip install cognee
Step 2: Configure the LLM
import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
Alternatively, create a .env file using our template.
To integrate other LLM providers, see our LLM Provider Documentation.
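If you go the .env route, a minimal file needs only the key shown above (the variable name comes from the snippet; any further provider-specific settings live in the full template):

```
LLM_API_KEY=YOUR_OPENAI_API_KEY
```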
Step 3: Run the Pipeline
Cognee takes your documents, generates a knowledge graph from them, and then answers queries over the graph using the combined relationships.
Now, run a minimal pipeline:
import cognee
import asyncio
from pprint import pprint

async def main():
    # Add text to cognee
    await cognee.add("Cognee turns documents into AI memory.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Add memory algorithms to the graph
    await cognee.memify()

    # Query the knowledge graph
    results = await cognee.search("What does Cognee do?")

    # Display the results
    for result in results:
        pprint(result)

if __name__ == '__main__':
    asyncio.run(main())
As you can see, the output is generated from the document we previously stored in Cognee:
Cognee turns documents into AI memory.
Use the Cognee CLI
As an alternative, you can get started with these essential commands:
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
cognee-cli search "What does Cognee do?"
cognee-cli delete --all
To open the local UI, run:
cognee-cli -ui
Demos & Examples
See Cognee in action:
Persistent Agent Memory
Cognee Memory for LangGraph Agents
Simple GraphRAG
Cognee with Ollama
Community & Support
Contributing
We welcome contributions from the community! Your input helps make Cognee better for everyone. See CONTRIBUTING.md to get started.
Code of Conduct
We're committed to fostering an inclusive and respectful community. Read our Code of Conduct for guidelines.
Research & Citation
We recently published a research paper on optimizing knowledge graphs for LLM reasoning:
@misc{markovic2025optimizinginterfaceknowledgegraphs,
  title={Optimizing the Interface Between Knowledge Graphs and LLMs for Complex Reasoning},
  author={Vasilije Markovic and Lazar Obradovic and Laszlo Hajdu and Jovan Pavlovic},
  year={2025},
  eprint={2505.24478},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2505.24478},
}