PromethAI-Memory
Memory management for AI Applications and AI Agents
The Motivation
Browsing the database of theresanaiforthat.com, we can observe around 7000 new, mostly semi-finished projects in the field of applied AI, whose development is fueled by new improvements in foundation models and open-source community contributions.
It seems it has never been easier to create a startup, build an app, and go to market… and fail.
Decades of technological advancements have led to small teams being able to do in 2023 what in 2015 required a team of dozens.
Yet, the AI apps currently being pushed out still mostly feel and perform like demos.
The rise of this new profession perhaps signals the need for a solution that is not yet there: a solution that, in essence, is a Large Language Model (LLM), a powerful general problem solver, available in the palm of your hand 24/7/365.
To address this issue, dlthub and prometh.ai will collaborate on productionizing a common use case, progressing step by step. We will use LLMs, frameworks, and services, refining the code until we attain a clearer understanding of what a modern LLM architecture stack might entail.
Read more in our blog post at prometh.ai
PromethAI-Memory Repo Structure
The repository contains a set of folders that represent the steps in the evolution of the modern data stack, from POC to production:
- Level 1 - CMD script to process PDFs (a minimal sketch of this step follows the list)
- Level 2 - Memory Manager implemented in Python
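As an illustration only, and not the code that ships in this repository, the sketch below shows what a Level 1 style command-line PDF processing step might look like in Python: it reads a PDF with the `pypdf` library, splits the extracted text into chunks, and prints them so they could be handed to an LLM or a memory manager. The chunk size, file paths, and function names are arbitrary assumptions.

```python
# Hypothetical sketch of a "Level 1" CMD script: extract text from a PDF
# and split it into chunks for downstream LLM processing. Illustrative only.
import argparse

from pypdf import PdfReader  # pip install pypdf


def extract_text(pdf_path: str) -> str:
    """Concatenate the text of every page in the PDF."""
    reader = PdfReader(pdf_path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)


def chunk_text(text: str, chunk_size: int = 1000) -> list[str]:
    """Split text into fixed-size chunks (chunk_size is an arbitrary choice)."""
    return [text[i : i + chunk_size] for i in range(0, len(text), chunk_size)]


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Process a PDF into text chunks")
    parser.add_argument("pdf_path", help="Path to the PDF file to process")
    parser.add_argument("--chunk-size", type=int, default=1000)
    args = parser.parse_args()

    for i, chunk in enumerate(chunk_text(extract_text(args.pdf_path), args.chunk_size)):
        print(f"--- chunk {i} ---")
        print(chunk)
```

In a Level 2 setup, these chunks would be handed to a memory manager rather than printed, but that component is beyond the scope of this quick sketch.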
