# Quickstart
!!! tip "To understand how cognee works, check out the conceptual overview."
## Setup
You will need a Weaviate instance and an OpenAI API key to use cognee. Weaviate lets you run a free instance for 14 days; you can sign up on their website: Weaviate.

You can also use Ollama or Anyscale as your LLM provider. For more information on local models, check here (an example provider configuration follows the environment variables below).
```python
import os

os.environ["WEAVIATE_URL"] = "YOUR_WEAVIATE_URL"
os.environ["WEAVIATE_API_KEY"] = "YOUR_WEAVIATE_API_KEY"
os.environ["OPENAI_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
## Run
```python
import cognee

text = """Natural language processing (NLP) is an interdisciplinary
subfield of computer science and information retrieval"""

cognee.add(text)  # Add a new piece of information
cognee.cognify()  # Use LLMs and cognee to create knowledge

search_results = cognee.search("SIMILARITY", "computer science")  # Query cognee for the knowledge

for result_text in search_results[0]:
    print(result_text)
```
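Depending on the cognee version you have installed, `add`, `cognify`, and `search` may be exposed as coroutines rather than plain functions. The sketch below shows the same flow under that assumption; check the API reference for your installed version before choosing between the two styles.

```python
import asyncio
import cognee

text = """Natural language processing (NLP) is an interdisciplinary
subfield of computer science and information retrieval"""


async def main():
    # Same three steps as above, awaited here on the assumption that the API is async.
    await cognee.add(text)
    await cognee.cognify()
    search_results = await cognee.search("SIMILARITY", "computer science")

    for result_text in search_results[0]:
        print(result_text)


asyncio.run(main())
```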