cognee
Deterministic LLM Outputs for AI Engineers, using graphs, LLMs, and vector retrieval
Open-source framework for creating self-improving deterministic outputs for LLMs.
Try it in a Google Colab notebook or have a look at our documentation
Join our Discord community
📦 Installation
With pip
pip install cognee
With poetry
poetry add cognee
💻 Usage
Setup
import os
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
You can also use Ollama or Anyscale as your LLM provider. For more info on local models, check our docs.
Run
import cognee
text = """Natural language processing (NLP) is an interdisciplinary
subfield of computer science and information retrieval"""
cognee.add([text], "example_dataset") # Add a new piece of information
cognee.cognify() # Use LLMs and cognee to create knowledge
search_results = cognee.search("SIMILARITY", "computer science") # Query cognee for the knowledge
for result_text in search_results[0]:
    print(result_text)
Add alternative data types:
cognee.add("file://{absolute_path_to_file}", dataset_name)
Or
cognee.add("data://{absolute_path_to_directory}", dataset_name)
# This is useful if you have a directory with files organized in subdirectories.
# You can target which directory to add by providing dataset_name.
# Example:
# root
# / \
# reports bills
# / \
# 2024 2023
#
# cognee.add("data://{absolute_path_to_root}", "reports.2024")
# This will add just directory 2024 under reports.
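# To make the dotted dataset_name convention above concrete, here is a small
# hypothetical helper (not part of cognee's API) showing how a name like
# "reports.2024" maps onto a subdirectory of the root:

```python
from pathlib import Path

def resolve_dataset_dir(root: str, dataset_name: str) -> Path:
    """Map a dotted dataset_name such as 'reports.2024' to the
    matching subdirectory under root (illustration only)."""
    return Path(root).joinpath(*dataset_name.split("."))

# resolve_dataset_dir("/data/root", "reports.2024")
# -> Path("/data/root/reports/2024")
```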
Read more here.
Vector retrieval, Graphs and LLMs
Cognee supports a variety of tools and services for different operations:
- Local Setup: By default, LanceDB runs locally with NetworkX and OpenAI.
- Vector Stores: Cognee supports Qdrant and Weaviate for vector storage.
- Language Models (LLMs): You can use either Anyscale or Ollama as your LLM provider.
- Graph Stores: In addition to NetworkX, Neo4j is also supported for graph storage.
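Providers are selected through environment variables; see the .env.template in the repo for the full set your installed version expects. A minimal sketch, assuming variable names like LLM_PROVIDER and VECTOR_DB_PROVIDER (verify against your .env.template before relying on them):

```python
import os

# Hypothetical provider settings -- check .env.template for the
# exact variable names your cognee version expects.
os.environ["LLM_PROVIDER"] = "ollama"            # or "anyscale", "openai"
os.environ["VECTOR_DB_PROVIDER"] = "qdrant"      # or "weaviate", "lancedb"
os.environ["GRAPH_DATABASE_PROVIDER"] = "neo4j"  # or "networkx"
```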
Demo
Check out our demo notebook here