* feat: Add support for LlamaIndex Document type (Feature #COG-337)
* docs: Add Jupyter notebook demonstrating cognee usage with the LlamaIndex Document type (Docs #COG-337)
* feat: Add metadata migration from the LlamaIndex Document type, allowing metadata from LlamaIndex documents to be used (Feature #COG-337)
* refactor: Rename the LlamaIndex migration function (Refactor #COG-337)
* chore: Add the llama-index-core dependency; tenacity and instructor had to be downgraded to support llama-index (Chore #COG-337)
* feat: Add an ingest_data_with_metadata task that has access to metadata when data comes from other data ingestion tools (Feature #COG-337)
* docs: Describe why exact type checking is used instead of isinstance, since isinstance also returns True for child classes (Docs #COG-337)
* fix: Add a missing parameter to a function call (Fix #COG-337)
* refactor: Move data storing from an async to a sync function (Refactor #COG-337)
* refactor: Modify the existing ingest_data file instead of keeping two separate ingest tasks (Refactor #COG-337)
* refactor: Keep the old name for data ingestion with metadata, merging the new and old ingestion tasks into one (Refactor #COG-337)
* refactor: Return the ingest_data and save_data_to_storage Tasks (Refactor #COG-337)
* refactor: Return the previous ingestion Tasks to the add function (Refactor #COG-337)
* fix: Use a string instead of a dict for the search query in the notebook and the simple example (Fix COG-337)
* refactor: Apply changes requested in the pull request: add the synchronize label, make the if-statement syntax in the workflow uniform, fix the instructor dependency, and make llama-index optional (Refactor COG-337)
* fix: Resolve the issue of llama-index being mandatory to run cognee (Fix COG-337)
* fix: Remove remaining llama-index references from the core cognee library; install llama-index-core from the notebook instead (Fix COG-337)
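The feature commits above add ingestion of LlamaIndex Document objects, including their metadata. A minimal usage sketch, assuming cognee.add accepts a list of LlamaIndex Document objects as these commits describe (the exact accepted types and call signature may differ):

import asyncio

import cognee
from llama_index.core import Document  # optional dependency: llama-index-core


async def ingest_llama_index_documents():
    # Sketch only: the commits state that metadata on each LlamaIndex Document
    # is migrated into cognee along with the document text.
    documents = [
        Document(
            text="Natural language processing (NLP) is an interdisciplinary subfield.",
            metadata={"source": "example.txt", "category": "nlp"},  # hypothetical metadata
        ),
    ]
    await cognee.add(documents)  # assumption: LlamaIndex documents can be passed directly
    await cognee.cognify()


asyncio.run(ingest_llama_index_documents())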
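One docs commit explains why exact type checks are used instead of isinstance when detecting LlamaIndex documents: isinstance also matches subclasses. A small, generic illustration of that Python behavior (standalone classes, not cognee's actual code):

class Base:
    pass


class Child(Base):
    pass


obj = Child()
print(isinstance(obj, Base))  # True: isinstance also accepts instances of subclasses
print(type(obj) is Base)      # False: an exact type check rejects subclasses
print(type(obj) is Child)     # True: only the exact class matches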
import cognee
import asyncio
from cognee.api.v1.search import SearchType

# Prerequisites:
# 1. Copy `.env.template` and rename it to `.env`.
# 2. Add your OpenAI API key to the `.env` file in the `LLM_API_KEY` field:
#    LLM_API_KEY = "your_key_here"
# 3. (Optional) To minimize setup effort, set `VECTOR_DB_PROVIDER="lancedb"` in `.env`.


async def main():
    # Create a clean slate for cognee -- reset data and system state
    await cognee.prune.prune_data()
    await cognee.prune.prune_system(metadata=True)

    # cognee knowledge graph will be created based on this text
    text = """
    Natural language processing (NLP) is an interdisciplinary
    subfield of computer science and information retrieval.
    """

    # Add the text, and make it available for cognify
    await cognee.add(text)

    # Use LLMs and cognee to create the knowledge graph
    await cognee.cognify()

    # Query cognee for insights on the added text
    search_results = await cognee.search(
        SearchType.INSIGHTS, query='Tell me about NLP'
    )

    # Display search results
    for result_text in search_results:
        print(result_text)


if __name__ == '__main__':
    asyncio.run(main())