Add base SQL query, rework docs a bit

Vasilije 2023-10-31 11:45:22 +01:00
parent 9237ba11f4
commit 6a13072d16
2 changed files with 39 additions and 4 deletions


@@ -98,13 +98,13 @@ RAG test manager can be used via API or via the CLI
#### Level 1 - OpenAI functions + Pydantic + DLTHub
Scope: Give PDFs to the model and get the output in a structured format
-Blog post: [Link] (https://prometh.ai/promethai-memory-blog-post-one)
+Blog post: [Link](https://prometh.ai/promethai-memory-blog-post-one)
We introduce the following concepts:
- Structured output with Pydantic
- CMD script to process custom PDFs
#### Level 2 - Memory Manager + Metadata management
Scope: Give PDFs to the model and consolidate with the previous user activity and more
-Blog post: [Link] (https://www.notion.so/topoteretes/Going-beyond-Langchain-Weaviate-Level-2-towards-Production-98ad7b915139478992c4c4386b5e5886?pvs=4)
+Blog post: [Link](https://www.notion.so/topoteretes/Going-beyond-Langchain-Weaviate-Level-2-towards-Production-98ad7b915139478992c4c4386b5e5886?pvs=4)
We introduce the following concepts:
- Long Term Memory -> store and format the data
@@ -115,7 +115,7 @@ We introduce the following concepts:
#### Level 3 - Dynamic Graph Memory Manager + DB + Rag Test Manager
Scope: Store the data in N-related stores and test the retrieval with the Rag Test Manager
-Blog post: [Link] (https://topoteretes.notion.site/Going-beyond-Langchain-Weaviate-Level-3-towards-production-e62946c272bf412584b12fbbf92d35b0?pvs=4)
+Blog post: [Link](https://topoteretes.notion.site/Going-beyond-Langchain-Weaviate-Level-3-towards-production-e62946c272bf412584b12fbbf92d35b0?pvs=4)
- Dynamic Memory Manager -> store the data in N hierarchical stores
- Auto-generation of tests
- Multiple file formats supported
@@ -175,7 +175,8 @@ Inspect the results in the DB:
``` SELECT * FROM test_outputs; ```
-Or set up the superset to visualize the results:
+Or set up Superset to visualize the results.
The base SQL query is in the example_data folder.


@@ -0,0 +1,34 @@
SELECT
ts.id AS test_set_id,
too.id AS test_output_id,
op.id AS operation_id,
ts.user_id AS test_set_user_id,
ts.content AS test_set_content,
ts.created_at AS test_set_created_at,
ts.updated_at AS test_set_updated_at,
too.set_id AS test_output_set_id,
too.user_id AS test_output_user_id,
too.test_set_id AS test_output_test_set_id,
too.operation_id AS test_output_operation_id,
too.test_params AS test_output_test_params,
too.test_result AS test_output_test_result,
too.test_score AS test_output_test_score,
too.test_metric_name AS test_output_test_metric_name,
too.test_query AS test_output_test_query,
too.test_output AS test_output_test_output,
too.test_expected_output AS test_output_test_expected_output,
too.test_context AS test_output_test_context,
too.test_results AS test_output_test_results,
too.created_at AS test_output_created_at,
too.updated_at AS test_output_updated_at,
op.user_id AS operation_user_id,
op.operation_type AS operation_operation_type,
op.operation_params AS operation_operation_params,
op.test_set_id AS operation_test_set_id,
op.created_at AS operation_created_at,
op.updated_at AS operation_updated_at
FROM public.test_sets ts
JOIN public.test_outputs too ON ts.id = too.test_set_id
JOIN public.operations op ON op.id = too.operation_id
WHERE op.operation_status = 'COMPLETED';
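The shape of the report query above (three-way join, filtered to completed operations) can be sketched against an in-memory SQLite database. This is a minimal sketch, not the project's actual schema: the table and column names are taken from the query, the columns are pared down to a few, and SQLite has no `public.` schema prefix. Note also that the string literal must use single quotes; in PostgreSQL, `"COMPLETED"` with double quotes would be parsed as a column identifier.

```python
import sqlite3

# Minimal in-memory stand-in for the three tables the report query joins.
# Columns are assumed from the query and trimmed for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE test_sets   (id INTEGER PRIMARY KEY, user_id TEXT, content TEXT);
CREATE TABLE operations  (id INTEGER PRIMARY KEY, operation_status TEXT);
CREATE TABLE test_outputs(id INTEGER PRIMARY KEY, test_set_id INTEGER,
                          operation_id INTEGER, test_score REAL);

INSERT INTO test_sets    VALUES (1, 'user-1', 'sample set');
INSERT INTO operations   VALUES (1, 'COMPLETED'), (2, 'PENDING');
INSERT INTO test_outputs VALUES (10, 1, 1, 0.9), (11, 1, 2, 0.4);
""")

# Same join shape as the report query, filtered to completed operations.
rows = conn.execute("""
SELECT ts.id, too.id, op.id, too.test_score
FROM test_sets ts
JOIN test_outputs too ON ts.id = too.test_set_id
JOIN operations op    ON op.id = too.operation_id
WHERE op.operation_status = 'COMPLETED';
""").fetchall()

print(rows)  # only the output tied to the COMPLETED operation survives
```

Because the filter is on `operations`, any `test_outputs` row whose operation is not `COMPLETED` drops out of the inner join result.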