
Cognee Logo

cognee - memory layer for AI apps and Agents

Learn more · Join Discord · Demo


AI Agent responses you can rely on.

Build dynamic Agent memory using scalable, modular ECL (Extract, Cognify, Load) pipelines.

More on use-cases.

Why cognee?

Features

  • Interconnect and retrieve your past conversations, documents, images, and audio transcriptions
  • Reduce hallucinations, developer effort, and cost
  • Load data to graph and vector databases using only Pydantic (see the sketch after this list)
  • Manipulate your data while ingesting from 30+ data sources
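
As a rough illustration of the Pydantic point above, here is a minimal sketch that defines a small typed schema and hands it to cognify. The graph_model parameter name is an assumption in this sketch, so check the documentation for the exact custom-model API in your version.

from pydantic import BaseModel

import cognee


class FieldOfStudy(BaseModel):
    name: str
    description: str


class ResearchGraph(BaseModel):
    fields: list[FieldOfStudy]


async def build_typed_graph(text: str):
    await cognee.add(text)
    # graph_model is an assumed parameter name for supplying a custom schema;
    # consult the cognee documentation for the exact custom-model API.
    await cognee.cognify(graph_model=ResearchGraph)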

Get Started

Get started quickly with a Google Colab notebook or starter repo

📦 Installation

You can install Cognee using pip, poetry, uv, or any other Python package manager.

With pip

pip install cognee
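
If you prefer poetry or uv, the equivalent commands look like this:

poetry add cognee
# or, using uv's pip-compatible interface:
uv pip install cognee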

💻 Basic Usage

Setup

import os
os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"

You can also set the variables by creating a .env file, using our template. To use different LLM providers, check out our documentation for more info.
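
If your setup does not pick up the .env file automatically, one explicit way to load it in your own script is with python-dotenv (an assumption in this sketch, not a cognee requirement):

import os

from dotenv import load_dotenv  # assumes the python-dotenv package is installed

load_dotenv()  # read .env from the current directory into the process environment
print("LLM_API_KEY set:", bool(os.getenv("LLM_API_KEY")))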

Simple example

Add LLM_API_KEY to .env using the command below.

echo "LLM_API_KEY=YOUR_OPENAI_API_KEY" > .env

You can see available env variables in the repository .env.template file.

This script will run the default pipeline:

import cognee
import asyncio
from cognee.modules.search.types import SearchType

async def main():
    # Create a clean slate for cognee -- reset data and system state
    await cognee.prune.prune_data()
    await cognee.prune.prune_system(metadata=True)
    # cognee knowledge graph will be created based on this text
    text = """
    Natural language processing (NLP) is an interdisciplinary
    subfield of computer science and information retrieval.
    """

    print("Adding text to cognee:")
    print(text.strip())
    # Add the text, and make it available for cognify
    await cognee.add(text)

    # Use LLMs and cognee to create knowledge graph
    await cognee.cognify()
    print("Cognify process complete.\n")


    query_text = "Tell me about NLP"
    print(f"Searching cognee for insights with query: '{query_text}'")
    # Query cognee for insights on the added text
    search_results = await cognee.search(
        query_text=query_text, query_type=SearchType.INSIGHTS
    )

    print("Search results:")
    # Display results
    for result_text in search_results:
        print(result_text)

    # Example output:
    # ({'id': UUID('bc338a39-64d6-549a-acec-da60846dd90d'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 1, 211808, tzinfo=datetime.timezone.utc), 'name': 'natural language processing', 'description': 'An interdisciplinary subfield of computer science and information retrieval.'}, {'relationship_name': 'is_a_subfield_of', 'source_node_id': UUID('bc338a39-64d6-549a-acec-da60846dd90d'), 'target_node_id': UUID('6218dbab-eb6a-5759-a864-b3419755ffe0'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 15, 473137, tzinfo=datetime.timezone.utc)}, {'id': UUID('6218dbab-eb6a-5759-a864-b3419755ffe0'), 'updated_at': datetime.datetime(2024, 11, 21, 12, 23, 1, 211808, tzinfo=datetime.timezone.utc), 'name': 'computer science', 'description': 'The study of computation and information processing.'})
    # (...)
    #
    # It represents nodes and relationships in the knowledge graph:
    # - The first element is the source node (e.g., 'natural language processing').
    # - The second element is the relationship between nodes (e.g., 'is_a_subfield_of').
    # - The third element is the target node (e.g., 'computer science').

if __name__ == '__main__':
    asyncio.run(main())

For more advanced usage, have a look at our documentation.
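
For example, instead of INSIGHTS triplets you can ask for a natural-language answer grounded in the graph. The snippet below assumes SearchType.GRAPH_COMPLETION is available in your installed version; see the documentation for the full list of search types.

import asyncio

import cognee
from cognee.modules.search.types import SearchType


async def ask():
    # GRAPH_COMPLETION is assumed here; it returns an LLM-generated answer
    # grounded in the knowledge graph rather than raw node/edge triplets.
    results = await cognee.search(
        query_text="What is NLP a subfield of?", query_type=SearchType.GRAPH_COMPLETION
    )
    for result in results:
        print(result)


if __name__ == "__main__":
    asyncio.run(ask())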

Understand our architecture

cognee concept diagram

Demos

What is AI memory:

Contributing

Your contributions are at the core of making this a true open source project. Any contributions you make are greatly appreciated. See CONTRIBUTING.md for more information.

Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.

💫 Contributors

contributors

Star History

Star History Chart