<!-- .github/pull_request_template.md -->
## Description

- Fixes the MCP server communication issue by switching logging output to sys.stderr (the default for Python logging)
- Adds the `api` optional dependency needed by FastAPI users
- Removes the lock file, since a new one will need to be generated after the next Cognee release that includes the `api` optional dependency
- Adds the log file location to the MCP tool call answer

## DCO Affirmation

I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.
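As background for the first point: an MCP server that speaks over stdio must keep stdout reserved for the protocol stream, so diagnostics belong on stderr. A minimal sketch of the idea follows; the logger name and message are illustrative and not taken from the cognee-mcp code:

```python
import logging
import sys

# StreamHandler writes to sys.stderr by default, so log output never
# mixes with the JSON-RPC messages the MCP server sends on stdout.
handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))

logger = logging.getLogger("cognee_mcp_example")  # illustrative logger name
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("This message goes to stderr, keeping stdout free for MCP traffic.")
```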
# cognee - memory layer for AI apps and Agents

Demo · Learn more · Join Discord

AI Agent responses you can rely on.

Build dynamic Agent memory using scalable, modular ECL (Extract, Cognify, Load) pipelines.

More on use-cases.
## Features

- Interconnect and retrieve your past conversations, documents, images, and audio transcriptions
- Reduce hallucinations, developer effort, and cost
- Load data to graph and vector databases using only Pydantic (see the sketch after this list)
- Manipulate your data while ingesting from 30+ data sources
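As a rough illustration of the Pydantic-only modeling mentioned in the third bullet: the `Person` model and its fields below are hypothetical and not part of cognee's API; they only show the kind of structured data that can be loaded into the graph and vector stores.

```python
from pydantic import BaseModel


class Person(BaseModel):
    """Hypothetical example model; cognee ingests structured data
    described with plain Pydantic models like this one."""

    name: str
    occupation: str
    bio: str


# Example instance that could be handed to an ingestion pipeline.
alan = Person(
    name="Alan Turing",
    occupation="mathematician",
    bio="Pioneer of theoretical computer science and artificial intelligence.",
)
```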
## Get Started

Get started quickly with a Google Colab notebook or starter repo.
## Contributing
Your contributions are at the core of making this a true open source project. Any contributions you make are greatly appreciated. See CONTRIBUTING.md for more information.
## 📦 Installation

You can install Cognee using pip, poetry, uv, or any other Python package manager.

### With pip

```bash
pip install cognee
```
## 💻 Basic Usage

### Setup

```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
You can also set the variables by creating a .env file, using our template. To use different LLM providers, check out our documentation for more info.
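If you go the .env route, one way to make sure the variables are loaded before cognee runs is python-dotenv. This is a minimal sketch assuming that package is installed; whether cognee also reads the .env file on its own is not shown here:

```python
# .env (same variable as the inline example above):
# LLM_API_KEY="YOUR_OPENAI_API_KEY"

from dotenv import load_dotenv  # provided by the python-dotenv package

# Copy the variables from .env into the process environment so that any
# code reading LLM_API_KEY, including cognee, can see them.
load_dotenv()
```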
### Simple example

This script will run the default pipeline:

```python
import cognee
import asyncio


async def main():
    # Add text to cognee
    await cognee.add(
        "Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval."
    )

    # Generate the knowledge graph
    await cognee.cognify()

    # Query the knowledge graph
    results = await cognee.search("Tell me about NLP")

    # Display the results
    for result in results:
        print(result)


if __name__ == "__main__":
    asyncio.run(main())
```
Example output:

```
({'id': UUID('bc338a39-64d6-549a-acec-da60846dd90d'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 1, 211808, tzinfo=datetime.timezone.utc), 'name': 'natural language processing', 'description': 'An interdisciplinary subfield of computer science and information retrieval.'}, {'relationship_name': 'is_a_subfield_of', 'source_node_id': UUID('bc338a39-64d6-549a-acec-da60846dd90d'), 'target_node_id': UUID('6218dbab-eb6a-5759-a864-b3419755ffe0'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 15, 473137, tzinfo=datetime.timezone.utc)}, {'id': UUID('6218dbab-eb6a-5759-a864-b3419755ffe0'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 1, 211808, tzinfo=datetime.timezone.utc), 'name': 'computer science', 'description': 'The study of computation and information processing.'})
```
Graph visualization:
Open in browser.
For more advanced usage, have a look at our documentation.
## Understand our architecture
## Demos
What is AI memory:
## Code of Conduct
We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.