Initial basic onboarding project

This commit is contained in:
hajdul88 2025-04-28 17:11:18 +02:00
parent c4915a4136
commit a4496ce047
15 changed files with 349 additions and 0 deletions

View file

@@ -0,0 +1,54 @@
# AssociationLayerDemo Crew
Welcome to the AssociationLayerDemo Crew project, powered by [crewAI](https://crewai.com). This template is designed to help you set up a multi-agent AI system with ease, leveraging the powerful and flexible framework provided by crewAI. Our goal is to enable your agents to collaborate effectively on complex tasks, maximizing their collective intelligence and capabilities.
## Installation
Ensure you have Python >=3.10 <3.13 installed on your system. This project uses [UV](https://docs.astral.sh/uv/) for dependency management and package handling, offering a seamless setup and execution experience.
First, if you haven't already, install uv:
```bash
pip install uv
```
Next, navigate to your project directory and install the dependencies.
(Optional) Lock the dependencies and install them using the CLI command:
```bash
crewai install
```
### Customizing
**Add your `OPENAI_API_KEY` into the `.env` file**
- Modify `src/association_layer_demo/config/agents.yaml` to define your agents
- Modify `src/association_layer_demo/config/tasks.yaml` to define your tasks
- Modify `src/association_layer_demo/crew.py` to add your own logic, tools and specific args
- Modify `src/association_layer_demo/main.py` to add custom inputs for your agents and tasks
## Running the Project
To kickstart your crew of AI agents and begin task execution, run this from the root folder of your project:
```bash
$ crewai run
```
This command initializes the association-layer-demo Crew, assembling the agents and assigning them tasks as defined in your configuration.
This example, unmodified, will create a `report.md` file in the root folder containing the output of a research run on LLMs.
## Understanding Your Crew
The association-layer-demo Crew is composed of multiple AI agents, each with unique roles, goals, and tools. These agents collaborate on a series of tasks, defined in `config/tasks.yaml`, leveraging their collective skills to achieve complex objectives. The `config/agents.yaml` file outlines the capabilities and configurations of each agent in your crew.
## Support
For support, questions, or feedback regarding the AssociationLayerDemo Crew or crewAI:
- Visit our [documentation](https://docs.crewai.com)
- Reach out to us through our [GitHub repository](https://github.com/joaomdmoura/crewai)
- [Join our Discord](https://discord.com/invite/X4JWnZnxPb)
- [Chat with our docs](https://chatg.pt/DWjSBZn)
Let's create wonders together with the power and simplicity of crewAI.

View file

@@ -0,0 +1,4 @@
User name is John Doe.
User is an AI Engineer.
User is interested in AI Agents.
User is based in San Francisco, California.

View file

@@ -0,0 +1,23 @@
[project]
name = "association_layer_demo"
version = "0.1.0"
description = "association-layer-demo using crewAI"
authors = [{ name = "Your Name", email = "you@example.com" }]
requires-python = ">=3.10,<3.13"
dependencies = [
"crewai[tools]>=0.114.0,<1.0.0"
]
[project.scripts]
association_layer_demo = "association_layer_demo.main:run"
run_crew = "association_layer_demo.main:run"
train = "association_layer_demo.main:train"
replay = "association_layer_demo.main:replay"
test = "association_layer_demo.main:test"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.crewai]
type = "crew"
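Each entry in `[project.scripts]` maps a console command to a `module:function` spec; e.g. `run_crew` resolves to `association_layer_demo.main:run`. A minimal sketch of that lookup, demonstrated against the stdlib `math` module since the project package may not be importable outside the repo:

```python
import importlib

def resolve_entry_point(spec: str):
    """Resolve a 'module:attr' entry-point spec to a callable,
    the same lookup a console script performs at launch."""
    module_name, attr = spec.split(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# Demo against the stdlib; the real specs point at association_layer_demo.main
fn = resolve_entry_point("math:sqrt")
print(fn(9))  # → 3.0
```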

View file

@@ -0,0 +1,33 @@
This is a demo project that showcases and tests how cognee and CrewAI can work together.
Short description:
Two basic agents are implemented:
1 - Data Ingestion Expert:
- Performs a Google search about cognee using Serper (SERPER_API_KEY is needed).
- Ingests all the search results into the cognee metastore using cognee.add
- Performs the cognify step using cognee.cognify
2 - Cognee Search Expert:
- Uses cognee.search (graph completion; other search tools can be added)
- Asks multiple questions to research what cognee is
The following tools were implemented:
- Serper search tool: searches the internet via Google (ingestion agent)
- Cognee add: adds Google search results to cognee's metastore (ingestion agent)
- Cognee cognify: cognifies the added documents (ingestion agent)
- Cognee search: searches the cognee memory (search agent)
Tried but not part of the pushed code:
- agents delegating to each other
- collaboration and communication via pydantic
- file reading and directory discovery using agents + file ingestion into cognee
In general, the agents work together to research what cognee is, ingest that knowledge into cognee,
build the knowledge graph, and perform different searches to collect knowledge about cognee.
Works from the IDE but not from the CLI (one import has to be changed for that).
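The ingestion-then-search flow described above can be sketched with the cognee calls named in this commit. This is a sketch only: it needs cognee installed and LLM/API keys configured, so it only defines the coroutine here rather than running it:

```python
import asyncio

async def pipeline(search_results):
    # Sketch of the demo flow; calling it requires cognee plus configured API keys.
    import cognee
    from cognee.api.v1.search import SearchType

    for result in search_results:
        await cognee.add(result)  # ingest each Serper search hit
    await cognee.cognify()        # build the knowledge graph
    return await cognee.search(
        query_type=SearchType.GRAPH_COMPLETION,
        query_text="What is cognee?",
    )
```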

View file

@@ -0,0 +1,16 @@
ingestion_agent:
  role: >
    Data Ingestion Expert
  goal: >
    Your goal is to search the internet and ingest the data you find into cognee's system.
  backstory: >
    You are a database ingestion expert with more than 20 years of experience.

search_agent:
  role: >
    Cognee Search Expert
  goal: >
    Your goal is to search for information in the cognee database and try to answer the questions you get.
    You cannot search on the internet.
  backstory: >
    You are a Cognee search expert whose expertise is using databases and researching them with different search tools in order to answer questions.

View file

@@ -0,0 +1,24 @@
google_task:
  description: >
    Search on the internet for cognee. You can execute multiple searches
    until you think the information is enough to answer everything about cognee.
    Perform at least 5 different searches about cognee and pass each Google search result to cognee.add separately.
  expected_output: >
    "yes" if all tool usage was successful, "no" otherwise
  agent: ingestion_agent

cognify_task:
  description: >
    After you have added every search result with cognee.add, cognify the ingested dataset with cognee.cognify.
  expected_output: >
    "yes" if all tool usage was successful, "no" otherwise
  agent: ingestion_agent

search_task:
  description: >
    Once everything is done and the dataset is cognified, wait until all the executions are finished.
    Research what cognee is using the cognee search tool.
    Ask at least 3 questions about the topic.
  expected_output: >
    The answers to the questions.
  agent: search_agent

View file

@@ -0,0 +1,42 @@
from crewai.tools import BaseTool
from typing import Type, Optional
from pydantic import BaseModel, Field


class CogneeAddInput(BaseModel):
    file_content: Optional[str] = Field(
        None, description="The file content to add to Cognee memory."
    )


class CogneeAdd(BaseTool):
    name: str = "Cognee Memory ADD"
    description: str = "Add data to cognee's memory store for later retrieval."
    args_schema: Type[BaseModel] = CogneeAddInput
    pruned: bool = False

    def _run(self, **kwargs) -> str:
        import cognee
        import asyncio

        async def main():
            # Reset the cognee stores once per tool instance, before the first add.
            if not self.pruned:
                print("Pruning data…")
                await cognee.prune.prune_data()
                await cognee.prune.prune_system(metadata=True)
                self.pruned = True
            await cognee.add(kwargs.get("file_content"))

        try:
            loop = asyncio.get_event_loop()
            if loop.is_running():
                # Cannot nest run_until_complete in a running loop; use a fresh one.
                loop = asyncio.new_event_loop()
                asyncio.set_event_loop(loop)
            loop.run_until_complete(main())
            return "File added"
        except Exception as e:
            return f"Tool execution error: {str(e)}"
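The event-loop juggling in `_run` exists because CrewAI tools are synchronous while cognee's API is async. A standalone sketch of that sync-to-async bridge (`run_async` is a hypothetical helper name, not part of either library; it uses `asyncio.run` where the tool code manages loops by hand):

```python
import asyncio
import threading

def run_async(coro):
    """Run a coroutine to completion from synchronous code."""
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        # No loop running in this thread: the simple path.
        return asyncio.run(coro)
    # A loop is already running here; a second loop cannot be started in the
    # same thread, so run the coroutine on its own loop in a worker thread.
    result = {}
    worker = threading.Thread(target=lambda: result.update(value=asyncio.run(coro)))
    worker.start()
    worker.join()
    return result["value"]

async def add_numbers(a, b):
    return a + b

print(run_async(add_numbers(2, 3)))  # → 5
```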

View file

@@ -0,0 +1,28 @@
from crewai.tools import BaseTool


class CogneeCognify(BaseTool):
    name: str = "Cognee Memory COGNIFY"
    description: str = "This tool can be used to cognify the ingested dataset with Cognee."

    def _run(self, **kwargs) -> str:
        import cognee
        import asyncio

        async def main():
            print("Cognifying dataset…")
            await cognee.cognify()

        try:
            loop = asyncio.get_event_loop()
            if loop.is_running():
                # Cannot nest run_until_complete in a running loop; use a fresh one.
                loop = asyncio.new_event_loop()
                asyncio.set_event_loop(loop)
            loop.run_until_complete(main())
            return "Dataset cognified"
        except Exception as e:
            return f"Tool execution error: {str(e)}"

View file

@@ -0,0 +1,43 @@
from crewai.tools import BaseTool
from typing import Type, Optional
from pydantic import BaseModel, Field
from cognee.api.v1.search import SearchType


class CogneeSearchInput(BaseModel):
    query: Optional[str] = Field(
        None, description="The query/question provided to the search engine"
    )


class CogneeSearch(BaseTool):
    name: str = "Cognee Memory SEARCH"
    description: str = (
        "Search inside the cognee memory engine, providing different questions/queries to answer."
    )
    args_schema: Type[BaseModel] = CogneeSearchInput

    def _run(self, **kwargs) -> str:
        import cognee
        import asyncio

        async def main():
            return await cognee.search(
                query_type=SearchType.GRAPH_COMPLETION, query_text=kwargs.get("query")
            )

        try:
            loop = asyncio.get_event_loop()
            if loop.is_running():
                # Cannot nest run_until_complete in a running loop; use a fresh one.
                loop = asyncio.new_event_loop()
                asyncio.set_event_loop(loop)
            return loop.run_until_complete(main())
        except Exception as e:
            return f"Tool execution error: {str(e)}"
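`SearchType.GRAPH_COMPLETION` is one of several cognee search modes; the enum value selects how a query is answered. A toy stand-in that only mimics the dispatch shape (hypothetical names, not cognee's actual implementation):

```python
from enum import Enum, auto

class SearchType(Enum):
    GRAPH_COMPLETION = auto()  # answer synthesized from the knowledge graph
    CHUNKS = auto()            # return raw matching text chunks

def toy_search(query_type, query_text):
    # Stand-in dispatch showing how a mode enum routes a query.
    if query_type is SearchType.GRAPH_COMPLETION:
        return f"graph answer for: {query_text}"
    return f"chunks for: {query_text}"

print(toy_search(SearchType.GRAPH_COMPLETION, "What is cognee?"))
```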

View file

@@ -0,0 +1,62 @@
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task, before_kickoff
from custom_tools.cognee_add import CogneeAdd
from custom_tools.cognee_cognify import CogneeCognify
from custom_tools.cognee_search import CogneeSearch
from crewai_tools import SerperDevTool
import os


@CrewBase
class IngestionCrew:
    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @before_kickoff
    def dump_env(self, *args, **kwargs):
        """Print environment variables at startup (debugging only: this exposes secrets in logs)."""
        print("=== Environment Variables ===")
        for key in sorted(os.environ):
            print(f"{key}={os.environ[key]}")

    @agent
    def ingestion_agent(self) -> Agent:
        return Agent(
            config=self.agents_config["ingestion_agent"],
            tools=[CogneeAdd(), CogneeCognify(), SerperDevTool()],
            verbose=True,
            allow_delegation=True,
        )

    @agent
    def search_agent(self) -> Agent:
        return Agent(
            config=self.agents_config["search_agent"],
            tools=[CogneeSearch()],
            verbose=True,
        )

    @task
    def search_on_google(self) -> Task:
        return Task(config=self.tasks_config["google_task"], async_execution=False)

    @task
    def cognify(self) -> Task:
        return Task(config=self.tasks_config["cognify_task"], async_execution=False)

    @task
    def search(self) -> Task:
        return Task(config=self.tasks_config["search_task"], async_execution=False)

    @crew
    def crew(self) -> Crew:
        return Crew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            verbose=True,
            share_crew=True,
            output_log_file="logs.txt",
        )
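With `Process.sequential`, the three tasks run in order (google_task → cognify_task → search_task), each seeing the context produced before it. A minimal sketch of that control flow in plain Python (hypothetical names, not the crewAI API):

```python
def run_sequential(tasks, context=None):
    # Each task receives the accumulated context and returns the new one.
    for task in tasks:
        context = task(context)
    return context

# Toy stand-ins for the three tasks in this crew
ingest = lambda _: "search results ingested"
cognify = lambda prev: f"{prev} -> graph built"
answer = lambda prev: f"{prev} -> questions answered"

print(run_sequential([ingest, cognify, answer]))
# → search results ingested -> graph built -> questions answered
```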

View file

@@ -0,0 +1,20 @@
#!/usr/bin/env python
import warnings

from ingestion_crew import IngestionCrew
# When running via the CLI entry point, use the package import instead:
# from association_layer_demo.ingestion_crew import IngestionCrew

warnings.filterwarnings("ignore", category=SyntaxWarning, module="pysbd")


def run():
    try:
        IngestionCrew().crew().kickoff()
    except Exception as e:
        raise Exception(f"An error occurred while running the crew: {e}")


if __name__ == "__main__":
    run()