
cognee

Deterministic LLM Outputs for AI Engineers using graphs, LLMs and vector retrieval

Cognee logo

Open-source framework for creating self-improving deterministic outputs for LLMs.


Cognee Demo

Try it in a Google Colab notebook or have a look at our documentation

If you have questions, join our Discord community

📦 Installation

With pip

pip install cognee

With poetry

poetry add cognee

💻 Usage

Setup

import os

os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"

or

import cognee
cognee.config.llm_api_key = "YOUR_OPENAI_API_KEY"

If you are using NetworkX, create an account on Graphistry to visualize results:

cognee.config.set_graphistry_username = "YOUR_USERNAME"
cognee.config.set_graphistry_password = "YOUR_PASSWORD"

To run the UI, run:

docker-compose up cognee

Then navigate to localhost:3000/wizard

You can also use Ollama or Anyscale as your LLM provider. For more info on local models, check our docs.
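As an illustration, here is a minimal sketch of pointing cognee at a locally running Ollama server. The environment variable names below (LLM_PROVIDER, LLM_ENDPOINT, LLM_MODEL) are illustrative assumptions, not confirmed cognee settings; check the docs or the .env.template in the repository for the exact keys.

import os

# Illustrative only: these variable names are assumptions, see the cognee docs for the real keys.
os.environ["LLM_PROVIDER"] = "ollama"                   # hypothetical provider switch
os.environ["LLM_ENDPOINT"] = "http://localhost:11434"   # default Ollama HTTP endpoint
os.environ["LLM_MODEL"] = "llama3"                      # any model you have pulled with `ollama pull`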

Run

import cognee

text = """Natural language processing (NLP) is an interdisciplinary
       subfield of computer science and information retrieval"""

cognee.add([text], "example_dataset") # Add a new piece of information

cognee.cognify() # Use LLMs and cognee to create knowledge

search_results = cognee.search("SIMILARITY", {'query': 'Tell me about NLP'}) # Query cognee for the knowledge

print(search_results)

Add alternative data types:

cognee.add("file://{absolute_path_to_file}", dataset_name)

Or

cognee.add("data://{absolute_path_to_directory}", dataset_name)

# This is useful if you have a directory with files organized in subdirectories.
# You can target which directory to add by providing dataset_name.
# Example:
#            root
#           /    \
#      reports  bills
#     /       \
#   2024     2023
#
# cognee.add("data://{absolute_path_to_root}", "reports.2024")
# This will add just the 2024 directory under reports.

Read more here.

Vector retrieval, Graphs and LLMs

Cognee supports a variety of tools and services for different operations:

  • Local Setup: By default, cognee runs with a local setup: LanceDB for vector storage, NetworkX for graph storage, and OpenAI as the LLM.

  • Vector Stores: Cognee also supports Qdrant and Weaviate for vector storage (see the configuration sketch after this list).

  • Language Models (LLMs): You can use either Anyscale or Ollama as your LLM provider.

  • Graph Stores: In addition to NetworkX, Neo4j is also supported for graph storage.
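Switching stores is typically a configuration change rather than a code change. Below is a hedged sketch of selecting Qdrant for vectors and Neo4j for graphs; the environment variable names are assumptions for illustration, not confirmed cognee settings, so consult the .env.template and the docs for the exact keys.

import os

# Illustrative only: the variable names are assumptions, not the confirmed cognee configuration.
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"          # e.g. lancedb (default), qdrant, weaviate
os.environ["VECTOR_DB_URL"] = "http://localhost:6333"
os.environ["VECTOR_DB_KEY"] = "YOUR_QDRANT_API_KEY"

os.environ["GRAPH_DATABASE_PROVIDER"] = "neo4j"      # e.g. networkx (default) or neo4j
os.environ["GRAPH_DATABASE_URL"] = "bolt://localhost:7687"
os.environ["GRAPH_DATABASE_USERNAME"] = "neo4j"
os.environ["GRAPH_DATABASE_PASSWORD"] = "YOUR_NEO4J_PASSWORD"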

Demo

Check out our demo notebook here

How it works

Image

Star History

Star History Chart