Feature/cog 1971 crewai GitHub ingest (#813)
## Description

`github_dev_profile.py`:
- Main coordinator class for GitHub user data
- Extracts basic user information
- Connects to commits and comments functionality

`github_dev_commits.py`:
- Extracts PR commit history
- Filters commits by time and quantity
- Structures commit data with file changes

`github_dev_comments.py`:
- Extracts user comments from issues
- Filters comments by time and quantity
- Structures comment data with issue context

`github_ingest.py`:
- Transforms GitHub data for Cognee
- Categorizes data into technical/soft-skill nodes
- Provides graph visualization support

## DCO Affirmation

I affirm that all code in every commit of this pull request conforms to the terms of the Topoteretes Developer Certificate of Origin.

---------

Co-authored-by: Igor Ilic <30923996+dexters1@users.noreply.github.com>
Co-authored-by: Hande <159312713+hande-k@users.noreply.github.com>
Co-authored-by: Vasilije <8619304+Vasilije1990@users.noreply.github.com>
Co-authored-by: Boris <boris@topoteretes.com>
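The modules above follow a coordinator pattern: the profile object owns the user's identity and API access, and passes itself into the helper classes that build scoped searches. A minimal offline sketch of that wiring, using hypothetical stand-in classes (`DevProfile`, `DevComments` are illustrative names, not the PR's real PyGithub-backed classes, and no network access is needed):

```python
from datetime import datetime, timedelta


class DevComments:
    """Stand-in for the comments helper: builds an issue-comment search
    scoped to the coordinator's username and a day window."""

    def __init__(self, profile, days=30):
        self.profile = profile  # back-reference to the coordinator
        self.days = days

    def search_query(self):
        date_limit = (datetime.now() - timedelta(days=self.days)).strftime("%Y-%m-%d")
        return f"commenter:{self.profile.username} is:issue created:>={date_limit}"


class DevProfile:
    """Stand-in coordinator: owns identity and hands itself to helpers."""

    def __init__(self, username):
        self.username = username
        self.comments = DevComments(self)


profile = DevProfile("octocat")
print(profile.comments.search_query())
```

The helpers never store a username of their own; every query reads it through the shared profile, so changing the coordinator changes all searches consistently.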
commit ad9abb8b76 (parent 5d4f82fdd4)
46 changed files with 1390 additions and 837 deletions
README.md (20 changes)

@@ -32,6 +32,14 @@ Build dynamic Agent memory using scalable, modular ECL (Extract, Cognify, Load)

More on [use-cases](https://docs.cognee.ai/use-cases) and [evals](https://github.com/topoteretes/cognee/tree/main/evals)

<p align="center">
🌐 Available Languages
:
<a href="community/README.pt.md">🇵🇹 Português</a>
·
<a href="community/README.zh.md">🇨🇳 中文</a>
</p>

<div style="text-align: center">
  <img src="https://raw.githubusercontent.com/topoteretes/cognee/refs/heads/main/assets/cognee_benefits.png" alt="Why cognee?" width="50%" />
</div>
@@ -50,7 +58,7 @@ More on [use-cases](https://docs.cognee.ai/use-cases) and [evals](https://github

## Get Started

Get started quickly with a Google Colab <a href="https://colab.research.google.com/drive/1g-Qnx6l_ecHZi0IOw23rg0qC4TYvEvWZ?usp=sharing">notebook</a> or <a href="https://github.com/topoteretes/cognee-starter">starter repo</a>
Get started quickly with a Google Colab <a href="https://colab.research.google.com/drive/1jHbWVypDgCLwjE71GSXhRL3YxYhCZzG1?usp=sharing">notebook</a> or <a href="https://github.com/topoteretes/cognee-starter">starter repo</a>

## Contributing
Your contributions are at the core of making this a true open source project. Any contributions you make are **greatly appreciated**. See [`CONTRIBUTING.md`](CONTRIBUTING.md) for more information.
@@ -116,12 +124,14 @@ Example output:

Natural Language Processing (NLP) is a cross-disciplinary and interdisciplinary field that involves computer science and information retrieval. It focuses on the interaction between computers and human language, enabling machines to understand and process natural language.
```

Graph visualization:
<a href="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/add-visualization-readme/assets/graph_visualization.html"><img src="assets/graph_visualization.png" width="100%" alt="Graph Visualization"></a>
Open in [browser](https://rawcdn.githack.com/topoteretes/cognee/refs/heads/add-visualization-readme/assets/graph_visualization.html).

For more advanced usage, have a look at our <a href="https://docs.cognee.ai">documentation</a>.

### cognee UI

You can also cognify your files and query them using the cognee UI.

<img src="assets/cognee-ui-2.webp" width="100%" alt="Cognee UI 2">

Try the cognee UI out locally [here](https://docs.cognee.ai/how-to-guides/cognee-ui).

## Understand our architecture
BIN assets/cognee-ui-1.webp (new file)
Binary file not shown. After Width: | Height: | Size: 878 KiB
BIN assets/cognee-ui-2.webp (new file)
Binary file not shown. After Width: | Height: | Size: 936 KiB
@@ -1,128 +0,0 @@

<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<script src="https://d3js.org/d3.v5.min.js"></script>
<style>
body, html { margin: 0; padding: 0; width: 100%; height: 100%; overflow: hidden; background: linear-gradient(90deg, #101010, #1a1a2e); color: white; font-family: 'Inter', sans-serif; }
svg { width: 100vw; height: 100vh; display: block; }
.links line { stroke: rgba(255, 255, 255, 0.4); stroke-width: 2px; }
.nodes circle { stroke: white; stroke-width: 0.5px; filter: drop-shadow(0 0 5px rgba(255,255,255,0.3)); }
.node-label { font-size: 5px; font-weight: bold; fill: white; text-anchor: middle; dominant-baseline: middle; font-family: 'Inter', sans-serif; pointer-events: none; }
.edge-label { font-size: 3px; fill: rgba(255, 255, 255, 0.7); text-anchor: middle; dominant-baseline: middle; font-family: 'Inter', sans-serif; pointer-events: none; }
</style>
</head>
<body>
<svg></svg>
<script>
var nodes = [{"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["text"]}, "type": "DocumentChunk", "text": "Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval.", "chunk_size": 34, "chunk_index": 0, "cut_type": "sentence_end", "id": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "color": "#801212", "name": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "Entity", "name": "natural language processing", "description": "An interdisciplinary subfield of computer science and information retrieval.", "ontology_valid": false, "id": "bc338a39-64d6-549a-acec-da60846dd90d", "color": "#f47710"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "EntityType", "name": "concept", "description": "concept", "ontology_valid": false, "id": "dd9713b7-dc20-5101-aad0-1c4216811147", "color": "#6510f4"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "Entity", "name": "information retrieval", "description": "The activity of obtaining information system resources that are relevant to an information need.", "ontology_valid": false, "id": "02bdab9a-0981-518c-a0d4-1684e0329447", "color": "#f47710"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "EntityType", "name": "field", "description": "field", "ontology_valid": false, "id": "0198571b-3e94-50ea-8b9f-19e3a31080c0", "color": "#6510f4"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "Entity", "name": "computer science", "description": "The study of computers and computational systems.", "ontology_valid": false, "id": "6218dbab-eb6a-5759-a864-b3419755ffe0", "color": "#f47710"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["name"]}, "type": "TextDocument", "name": "text_46d2fce36f0f7b6ebc0575e353fdba5c", 
"raw_data_location": "/Users/handekafkas/Documents/local-code/new-cognee/cognee/cognee/.data_storage/data/text_46d2fce36f0f7b6ebc0575e353fdba5c.txt", "external_metadata": "{}", "mime_type": "text/plain", "id": "c07949fe-5a9f-53b9-ac90-5cb48a8a4303", "color": "#D3D3D3"}, {"version": 1, "topological_rank": 0, "metadata": {"index_fields": ["text"]}, "type": "TextSummary", "text": "Natural language processing (NLP) is a cross-disciplinary area of computer science and information extraction.", "id": "9da41e72-8150-5055-9217-eea49d1bc447", "color": "#1077f4", "name": "9da41e72-8150-5055-9217-eea49d1bc447"}];
var links = [{"source": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "target": "bc338a39-64d6-549a-acec-da60846dd90d", "relation": "contains"}, {"source": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "target": "02bdab9a-0981-518c-a0d4-1684e0329447", "relation": "contains"}, {"source": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "target": "6218dbab-eb6a-5759-a864-b3419755ffe0", "relation": "contains"}, {"source": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "target": "c07949fe-5a9f-53b9-ac90-5cb48a8a4303", "relation": "is_part_of"}, {"source": "bc338a39-64d6-549a-acec-da60846dd90d", "target": "dd9713b7-dc20-5101-aad0-1c4216811147", "relation": "is_a"}, {"source": "bc338a39-64d6-549a-acec-da60846dd90d", "target": "6218dbab-eb6a-5759-a864-b3419755ffe0", "relation": "is_a_subfield_of"}, {"source": "bc338a39-64d6-549a-acec-da60846dd90d", "target": "02bdab9a-0981-518c-a0d4-1684e0329447", "relation": "is_a_subfield_of"}, {"source": "02bdab9a-0981-518c-a0d4-1684e0329447", "target": "0198571b-3e94-50ea-8b9f-19e3a31080c0", "relation": "is_a"}, {"source": "6218dbab-eb6a-5759-a864-b3419755ffe0", "target": "0198571b-3e94-50ea-8b9f-19e3a31080c0", "relation": "is_a"}, {"source": "9da41e72-8150-5055-9217-eea49d1bc447", "target": "b5b7b6b3-3bb7-5efd-a975-5a01e0d40220", "relation": "made_from"}];

var svg = d3.select("svg"),
    width = window.innerWidth,
    height = window.innerHeight;

var container = svg.append("g");

var simulation = d3.forceSimulation(nodes)
    .force("link", d3.forceLink(links).id(d => d.id).strength(0.1))
    .force("charge", d3.forceManyBody().strength(-275))
    .force("center", d3.forceCenter(width / 2, height / 2))
    .force("x", d3.forceX().strength(0.1).x(width / 2))
    .force("y", d3.forceY().strength(0.1).y(height / 2));

var link = container.append("g")
    .attr("class", "links")
    .selectAll("line")
    .data(links)
    .enter().append("line")
    .attr("stroke-width", 2);

var edgeLabels = container.append("g")
    .attr("class", "edge-labels")
    .selectAll("text")
    .data(links)
    .enter().append("text")
    .attr("class", "edge-label")
    .text(d => d.relation);

var nodeGroup = container.append("g")
    .attr("class", "nodes")
    .selectAll("g")
    .data(nodes)
    .enter().append("g");

var node = nodeGroup.append("circle")
    .attr("r", 13)
    .attr("fill", d => d.color)
    .call(d3.drag()
        .on("start", dragstarted)
        .on("drag", dragged)
        .on("end", dragended));

nodeGroup.append("text")
    .attr("class", "node-label")
    .attr("dy", 4)
    .attr("text-anchor", "middle")
    .text(d => d.name);

node.append("title").text(d => JSON.stringify(d));

simulation.on("tick", function() {
    link.attr("x1", d => d.source.x)
        .attr("y1", d => d.source.y)
        .attr("x2", d => d.target.x)
        .attr("y2", d => d.target.y);

    edgeLabels
        .attr("x", d => (d.source.x + d.target.x) / 2)
        .attr("y", d => (d.source.y + d.target.y) / 2 - 5);

    node.attr("cx", d => d.x)
        .attr("cy", d => d.y);

    nodeGroup.select("text")
        .attr("x", d => d.x)
        .attr("y", d => d.y)
        .attr("dy", 4)
        .attr("text-anchor", "middle");
});

svg.call(d3.zoom().on("zoom", function() {
    container.attr("transform", d3.event.transform);
}));

function dragstarted(d) {
    if (!d3.event.active) simulation.alphaTarget(0.3).restart();
    d.fx = d.x;
    d.fy = d.y;
}

function dragged(d) {
    d.fx = d3.event.x;
    d.fy = d3.event.y;
}

function dragended(d) {
    if (!d3.event.active) simulation.alphaTarget(0);
    d.fx = null;
    d.fy = null;
}

window.addEventListener("resize", function() {
    width = window.innerWidth;
    height = window.innerHeight;
    svg.attr("width", width).attr("height", height);
    simulation.force("center", d3.forceCenter(width / 2, height / 2));
    simulation.alpha(1).restart();
});
</script>

<svg style="position: fixed; bottom: 10px; right: 10px; width: 150px; height: auto; z-index: 9999;" viewBox="0 0 158 44" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M11.7496 4.92654C7.83308 4.92654 4.8585 7.94279 4.8585 11.3612V14.9304C4.8585 18.3488 7.83308 21.3651 11.7496 21.3651C13.6831 21.3651 15.0217 20.8121 16.9551 19.3543C18.0458 18.5499 19.5331 18.8013 20.3263 19.9072C21.1195 21.0132 20.8717 22.5213 19.781 23.3257C17.3518 25.0851 15.0217 26.2414 11.7 26.2414C5.35425 26.2414 0 21.2646 0 14.9304V11.3612C0 4.97681 5.35425 0.0502739 11.7 0.0502739C15.0217 0.0502739 17.3518 1.2065 19.781 2.96598C20.8717 3.77032 21.1195 5.27843 20.3263 6.38439C19.5331 7.49035 18.0458 7.69144 16.9551 6.93737C15.0217 5.52979 13.6831 4.92654 11.7496 4.92654ZM35.5463 4.92654C31.7289 4.92654 28.6552 8.04333 28.6552 11.8639V14.478C28.6552 18.2986 31.7289 21.4154 35.5463 21.4154C39.3141 21.4154 42.3878 18.2986 42.3878 14.478V11.8639C42.3878 8.04333 39.3141 4.92654 35.5463 4.92654ZM23.7967 11.8639C23.7967 5.32871 29.0518 0 35.5463 0C42.0408 0 47.2463 5.32871 47.2463 11.8639V14.478C47.2463 21.0132 42.0408 26.3419 35.5463 26.3419C29.0518 26.3419 23.7967 21.0635 23.7967 14.478V11.8639ZM63.3091 5.07736C59.4917 5.07736 56.418 8.19415 56.418 12.0147C56.418 15.8353 59.4917 18.9521 63.3091 18.9521C67.1265 18.9521 70.1506 15.8856 70.1506 12.0147C70.1506 8.14388 67.0769 5.07736 63.3091 5.07736ZM51.5595 11.9645C51.5595 5.42925 56.8146 0.150814 63.3091 0.150814C66.0854 0.150814 68.5642 1.10596 70.5968 2.71463L72.4311 0.904876C73.3731 -0.0502693 74.9099 -0.0502693 75.8519 0.904876C76.7938 1.86002 76.7938 3.41841 75.8519 4.37356L73.7201 6.53521C74.5629 8.19414 75.0587 10.0542 75.0587 12.0147C75.0587 18.4997 69.8532 23.8284 63.3587 23.8284C63.3091 23.8284 63.2099 23.8284 63.1603 23.8284H58.0044C57.1616 23.8284 56.4675 24.5322 56.4675 25.3868C56.4675 26.2414 57.1616 26.9452 58.0044 26.9452H64.6476H66.7794C68.5146 26.9452 70.3489 27.4479 71.7866 28.6041C73.2739 29.8106 74.2159 31.5701 74.4142 33.7317C74.7116 37.6026 72.0345 40.2166 69.8532 41.0713L63.8048 43.7859C62.5654 44.3389 61.1277 43.7859 60.6319 42.5291C60.0866 
41.2723 60.6319 39.8648 61.8714 39.3118L68.0188 36.5972C68.0684 36.5972 68.118 36.5469 68.1675 36.5469C68.4154 36.4463 68.8616 36.1447 69.2087 35.6923C69.5061 35.2398 69.7044 34.7371 69.6548 34.1339C69.6053 33.229 69.2582 32.7263 68.8616 32.4247C68.4154 32.0728 67.7214 31.8214 66.8786 31.8214H58.2027C58.1531 31.8214 58.1531 31.8214 58.1035 31.8214H58.054C54.534 31.8214 51.6586 28.956 51.6586 25.3868C51.6586 23.0743 52.8485 21.0635 54.6828 19.9072C52.6997 17.7959 51.5595 15.031 51.5595 11.9645ZM90.8736 5.07736C87.0562 5.07736 83.9824 8.19415 83.9824 12.0147V23.9289C83.9824 25.2862 82.8917 26.3922 81.5532 26.3922C80.2146 26.3922 79.1239 25.2862 79.1239 23.9289V11.9645C79.1239 5.42925 84.379 0.150814 90.824 0.150814C97.2689 0.150814 102.524 5.42925 102.524 11.9645V23.8786C102.524 25.2359 101.433 26.3419 100.095 26.3419C98.7562 26.3419 97.6655 25.2359 97.6655 23.8786V11.9645C97.7647 8.14387 94.6414 5.07736 90.8736 5.07736ZM119.43 5.07736C115.513 5.07736 112.39 8.24441 112.39 12.065V14.5785C112.39 18.4494 115.513 21.5662 119.43 21.5662C120.768 21.5662 122.057 21.164 123.098 20.5105C124.238 19.8067 125.726 20.1586 126.42 21.3148C127.114 22.4711 126.767 23.9792 125.627 24.683C123.842 25.7889 121.71 26.4425 119.43 26.4425C112.885 26.4425 107.581 21.1137 107.581 14.5785V12.065C107.581 5.47952 112.935 0.201088 119.43 0.201088C125.032 0.201088 129.692 4.07194 130.931 9.3001L131.427 11.3612L121.115 15.584C119.876 16.0867 118.488 15.4834 117.942 14.2266C117.447 12.9699 118.041 11.5623 119.281 11.0596L125.478 8.54604C124.238 6.43466 122.008 5.07736 119.43 5.07736ZM146.003 5.07736C142.086 5.07736 138.963 8.24441 138.963 12.065V14.5785C138.963 18.4494 142.086 21.5662 146.003 21.5662C147.341 21.5662 148.63 21.164 149.671 20.5105C150.217 20.1586 150.663 19.8067 151.109 19.304C152.001 18.2986 153.538 18.2483 154.53 19.2034C155.521 20.1083 155.571 21.6667 154.629 22.6721C153.935 23.4262 153.092 24.13 152.2 24.683C150.415 25.7889 148.283 26.4425 146.003 26.4425C139.458 26.4425 134.154 
21.1137 134.154 14.5785V12.065C134.154 5.47952 139.508 0.201088 146.003 0.201088C151.605 0.201088 156.265 4.07194 157.504 9.3001L158 11.3612L147.688 15.584C146.449 16.0867 145.061 15.4834 144.515 14.2266C144.019 12.9699 144.614 11.5623 145.854 11.0596L152.051 8.54604C150.762 6.43466 148.58 5.07736 146.003 5.07736Z" fill="white"/>
</svg>
</body>
</html>
Binary file not shown. Before Width: | Height: | Size: 365 KiB
@@ -1,18 +1,17 @@
import os
import pathlib
import asyncio
from cognee.shared.logging_utils import get_logger
from uuid import NAMESPACE_OID, uuid5
from cognee.shared.logging_utils import get_logger
from cognee.modules.observability.get_observe import get_observe

from cognee.api.v1.search import SearchType, search
from cognee.api.v1.visualize.visualize import visualize_graph
from cognee.base_config import get_base_config
from cognee.modules.cognify.config import get_cognify_config
from cognee.modules.pipelines import run_tasks
from cognee.modules.pipelines.tasks.task import Task
from cognee.modules.users.methods import get_default_user
from cognee.shared.data_models import KnowledgeGraph, MonitoringTool
from cognee.shared.utils import render_graph
from cognee.shared.data_models import KnowledgeGraph
from cognee.tasks.documents import classify_documents, extract_chunks_from_documents
from cognee.tasks.graph import extract_graph_from_data
from cognee.tasks.ingestion import ingest_data

@@ -22,11 +21,7 @@ from cognee.tasks.storage import add_data_points
from cognee.tasks.summarization import summarize_text
from cognee.infrastructure.llm import get_max_chunk_tokens

monitoring = get_base_config().monitoring_tool

if monitoring == MonitoringTool.LANGFUSE:
    from langfuse.decorators import observe

observe = get_observe()

logger = get_logger("code_graph_pipeline")
@@ -1,14 +1,14 @@
import os
from typing import Optional
from functools import lru_cache
from pydantic_settings import BaseSettings, SettingsConfigDict
from cognee.root_dir import get_absolute_path
from cognee.shared.data_models import MonitoringTool
from cognee.modules.observability.observers import Observer
from pydantic_settings import BaseSettings, SettingsConfigDict


class BaseConfig(BaseSettings):
    data_root_directory: str = get_absolute_path(".data_storage")
    monitoring_tool: object = MonitoringTool.LANGFUSE
    monitoring_tool: object = Observer.LANGFUSE
    graphistry_username: Optional[str] = os.getenv("GRAPHISTRY_USERNAME")
    graphistry_password: Optional[str] = os.getenv("GRAPHISTRY_PASSWORD")
    langfuse_public_key: Optional[str] = os.getenv("LANGFUSE_PUBLIC_KEY")
@@ -0,0 +1,5 @@
from .github_dev_profile import GitHubDevProfile
from .github_dev_comments import GitHubDevComments
from .github_dev_commits import GitHubDevCommits

__all__ = ["GitHubDevProfile", "GitHubDevComments", "GitHubDevCommits"]
@@ -0,0 +1,105 @@
from github import Github
from datetime import datetime, timedelta


class GitHubDevComments:
    """Class for working with a GitHub developer's comments."""

    def __init__(
        self, profile, days=30, issues_limit=10, max_comments=5, include_issue_details=True
    ):
        """Initialize with a GitHubDevProfile instance and default parameters."""
        self.profile = profile
        self.days = days
        self.issues_limit = issues_limit
        self.max_comments = max_comments
        self.include_issue_details = include_issue_details

    def get_issue_comments(self):
        """Fetches comments made by the user on issues across repositories within timeframe."""
        if not self.profile.user:
            return None

        date_filter = self._get_date_filter(self.days)
        query = f"commenter:{self.profile.username} is:issue{date_filter}"

        return self._get_comments_from_search(query)

    def get_repo_issue_comments(self, repo_name):
        """Fetches comments made by the user on issues in a specific repository within timeframe."""
        if not self.profile.user:
            return None

        date_filter = self._get_date_filter(self.days)
        query = f"repo:{repo_name} is:issue commenter:{self.profile.username}{date_filter}"
        self.profile.github.get_repo(repo_name)

        return self._get_comments_from_search(query)

    def set_limits(
        self, days=None, issues_limit=None, max_comments=None, include_issue_details=None
    ):
        """Sets all search parameters for comment searches."""
        if days is not None:
            self.days = days
        if issues_limit is not None:
            self.issues_limit = issues_limit
        if max_comments is not None:
            self.max_comments = max_comments
        if include_issue_details is not None:
            self.include_issue_details = include_issue_details

    def _get_date_filter(self, days):
        """Creates a date filter string for GitHub search queries."""
        if not days:
            return ""

        date_limit = (datetime.now() - timedelta(days=days)).strftime("%Y-%m-%d")
        return f" created:>={date_limit}"

    def _get_comments_from_search(self, query):
        """Retrieves comments based on a search query for issues."""
        try:
            issues = list(self.profile.github.search_issues(query))
        except Exception as e:
            print(f"Error executing search query: {e}")
            return []

        if not issues:
            return []

        all_comments = [
            self._extract_comment_data(issue, comment)
            for issue in issues[: self.issues_limit]
            for comment in self._get_user_comments_from_issue(issue)
        ]

        return all_comments

    def _get_user_comments_from_issue(self, issue):
        """Gets comments made by the user on a specific issue."""
        try:
            all_comments = list(issue.get_comments())
            user_comments = [c for c in all_comments if c.user.login == self.profile.username]
            return user_comments[: self.max_comments]
        except Exception as e:
            print(f"Error getting comments from issue #{issue.number}: {e}")
            return []

    def _extract_comment_data(self, issue, comment):
        """Creates a structured data object from a comment."""
        comment_data = {
            "repo": issue.repository.name,
            "issue_number": issue.number,
            "comment_id": comment.id,
            "body": comment.body,
            "created_at": comment.created_at,
            "updated_at": comment.updated_at,
            "html_url": comment.html_url,
            "issue_url": issue.html_url,
            "author_association": getattr(comment, "author_association", "UNKNOWN"),
            "issue_title": issue.title,
            "issue_state": issue.state,
        }

        return comment_data
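The `_get_date_filter` helper above turns a day window into a GitHub search qualifier, and a falsy `days` value disables date filtering entirely by returning an empty string that disappears when appended to the query. A standalone, stdlib-only sketch of the same logic:

```python
from datetime import datetime, timedelta


def get_date_filter(days):
    # Mirrors GitHubDevComments._get_date_filter:
    # falsy days (None or 0) means "no date clause at all"
    if not days:
        return ""
    date_limit = (datetime.now() - timedelta(days=days)).strftime("%Y-%m-%d")
    return f" created:>={date_limit}"


# The result is appended directly onto a search query string,
# so it carries its own leading space:
query = f"commenter:octocat is:issue{get_date_filter(30)}"
print(query)
```

Because the clause carries its own leading space, the call sites can concatenate it unconditionally without producing a trailing space when filtering is off.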
@@ -0,0 +1,195 @@
from github import Github
from datetime import datetime, timedelta


class GitHubDevCommits:
    """Class for working with a GitHub developer's commits in pull requests."""

    def __init__(
        self,
        profile,
        days=30,
        prs_limit=10,
        commits_per_pr=5,
        include_files=False,
        skip_no_diff=False,
    ):
        """Initialize with a GitHubDevProfile instance and default parameters."""
        self.profile = profile
        self.days = days
        self.prs_limit = prs_limit
        self.commits_per_pr = commits_per_pr
        self.include_files = include_files
        self.skip_no_diff = skip_no_diff
        self.file_keys = ["filename", "status", "additions", "deletions", "changes", "diff"]

    def get_user_commits(self):
        """Fetches user's most recent commits from pull requests."""
        if not self.profile.user:
            return None

        commits = self._collect_user_pr_commits()
        return {"user": self.profile.get_user_info(), "commits": commits}

    def get_user_file_changes(self):
        """Returns a flat list of file changes with associated commit information from PRs."""
        if not self.profile.user:
            return None

        all_files = []
        commits = self._collect_user_pr_commits(include_files=True)

        for commit in commits:
            if "files" not in commit:
                continue

            commit_info = {
                "repo": commit["repo"],
                "commit_sha": commit["sha"],
                "commit_message": commit["message"],
                "commit_date": commit["date"],
                "commit_url": commit["url"],
                "pr_number": commit.get("pr_number"),
                "pr_title": commit.get("pr_title"),
            }

            file_changes = []
            for file in commit["files"]:
                file_data = {key: file.get(key) for key in self.file_keys}
                file_changes.append({**file_data, **commit_info})

            all_files.extend(file_changes)

        return all_files

    def set_options(
        self, days=None, prs_limit=None, commits_per_pr=None, include_files=None, skip_no_diff=None
    ):
        """Sets commit search parameters."""
        if days is not None:
            self.days = days
        if prs_limit is not None:
            self.prs_limit = prs_limit
        if commits_per_pr is not None:
            self.commits_per_pr = commits_per_pr
        if include_files is not None:
            self.include_files = include_files
        if skip_no_diff is not None:
            self.skip_no_diff = skip_no_diff

    def _get_date_filter(self, days):
        """Creates a date filter string for GitHub search queries."""
        if not days:
            return ""

        date_limit = (datetime.now() - timedelta(days=days)).strftime("%Y-%m-%d")
        return f" created:>={date_limit}"

    def _collect_user_pr_commits(self, include_files=None):
        """Collects and sorts a user's recent commits from pull requests they authored."""
        include_files = include_files if include_files is not None else self.include_files

        prs = self._get_user_prs()

        if not prs:
            return []

        all_commits = []
        for pr in prs[: self.prs_limit]:
            pr_commits = self._get_commits_from_pr(pr, include_files)
            all_commits.extend(pr_commits)

        sorted_commits = sorted(all_commits, key=lambda x: x["date"], reverse=True)
        return sorted_commits

    def _get_user_prs(self):
        """Gets pull requests authored by the user."""
        date_filter = self._get_date_filter(self.days)
        query = f"author:{self.profile.username} is:pr is:merged{date_filter}"

        try:
            return list(self.profile.github.search_issues(query))
        except Exception as e:
            print(f"Error searching for PRs: {e}")
            return []

    def _get_commits_from_pr(self, pr_issue, include_files=None):
        """Gets commits by the user from a specific PR."""
        include_files = include_files if include_files is not None else self.include_files

        pr_info = self._get_pull_request_object(pr_issue)
        if not pr_info:
            return []

        repo_name, pr = pr_info

        all_commits = self._get_all_pr_commits(pr, pr_issue.number)
        if not all_commits:
            return []

        user_commits = [
            c
            for c in all_commits
            if c.author and hasattr(c.author, "login") and c.author.login == self.profile.username
        ]

        commit_data = [
            self._extract_commit_data(commit, repo_name, pr_issue, include_files)
            for commit in user_commits[: self.commits_per_pr]
        ]

        return commit_data

    def _get_pull_request_object(self, pr_issue):
        """Gets repository and pull request objects from an issue."""
        try:
            repo_name = pr_issue.repository.full_name
            repo = self.profile.github.get_repo(repo_name)
            pr = repo.get_pull(pr_issue.number)
            return (repo_name, pr)
        except Exception as e:
            print(f"Error accessing PR #{pr_issue.number}: {e}")
            return None

    def _get_all_pr_commits(self, pr, pr_number):
        """Gets all commits from a pull request."""
        try:
            return list(pr.get_commits())
        except Exception as e:
            print(f"Error retrieving commits from PR #{pr_number}: {e}")
            return None

    def _extract_commit_data(self, commit, repo_name, pr_issue, include_files=None):
        """Extracts relevant data from a commit object within a PR context."""
        commit_data = {
            "repo": repo_name,
            "sha": commit.sha,
            "message": commit.commit.message,
            "date": commit.commit.author.date,
            "url": commit.html_url,
            "pr_number": pr_issue.number,
            "pr_title": pr_issue.title,
            "pr_url": pr_issue.html_url,
        }

        include_files = include_files if include_files is not None else self.include_files

        if include_files:
            commit_data["files"] = self._extract_commit_files(commit)

        return commit_data

    def _extract_commit_files(self, commit):
        """Extracts files changed in a commit, including diffs."""
        files = []
        for file in commit.files:
            if self.skip_no_diff and not file.patch:
                continue

            file_data = {key: getattr(file, key, None) for key in self.file_keys}

            if "diff" in self.file_keys:
                file_data["diff"] = file.patch if file.patch else "No diff available for this file"

            files.append(file_data)
        return files
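The file-extraction step above reads a fixed key list off each PyGithub file object with `getattr(file, key, None)` and substitutes a placeholder when no diff is available. The same logic can be exercised offline with `SimpleNamespace` stubs standing in for PyGithub objects (the stub data below is invented for illustration; `skip_no_diff` defaults to True here to show the filtering path):

```python
from types import SimpleNamespace

FILE_KEYS = ["filename", "status", "additions", "deletions", "changes", "diff"]


def extract_commit_files(files, skip_no_diff=True):
    """Mirrors GitHubDevCommits._extract_commit_files against stub objects."""
    out = []
    for f in files:
        # Binary files (images, etc.) have no textual patch; optionally skip them
        if skip_no_diff and not f.patch:
            continue
        # Missing attributes (like "diff") default to None rather than raising
        data = {key: getattr(f, key, None) for key in FILE_KEYS}
        # "diff" is filled from the patch, with a fallback message
        data["diff"] = f.patch if f.patch else "No diff available for this file"
        out.append(data)
    return out


stub_files = [
    SimpleNamespace(filename="a.py", status="modified",
                    additions=3, deletions=1, changes=4, patch="@@ -1 +1 @@"),
    SimpleNamespace(filename="img.png", status="added",
                    additions=0, deletions=0, changes=0, patch=None),
]

print(extract_commit_files(stub_files))
```

With `skip_no_diff=True` the binary file is dropped entirely; with `skip_no_diff=False` it is kept and its `diff` field carries the fallback string instead of a patch.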
@ -0,0 +1,116 @@
|
|||
+from github import Github
+from datetime import datetime
+import json
+import os
+from cognee.complex_demos.crewai_demo.src.crewai_demo.github_dev_comments import GitHubDevComments
+from cognee.complex_demos.crewai_demo.src.crewai_demo.github_dev_commits import GitHubDevCommits
+
+
+class GitHubDevProfile:
+    """Class for working with a GitHub developer's profile, commits, and activity."""
+
+    def __init__(self, username, token):
+        """Initialize with a username and GitHub API token."""
+        self.github = Github(token) if token else Github()
+        self.token = token
+        self.username = username
+        self.user = self._get_user(username)
+        self.user_info = self._extract_user_info() if self.user else None
+        self.comments = GitHubDevComments(self) if self.user else None
+        self.commits = GitHubDevCommits(self) if self.user else None
+
+    def get_user_info(self):
+        """Returns the cached user information."""
+        return self.user_info
+
+    def get_user_repos(self, limit=None):
+        """Returns a list of the user's repositories, optionally limited."""
+        if not self.user:
+            return []
+
+        repos = list(self.user.get_repos())
+        if limit:
+            repos = repos[:limit]
+        return repos
+
+    def get_user_commits(self, days=30, prs_limit=5, commits_per_pr=3, include_files=False):
+        """Fetches the user's most recent commits from pull requests."""
+        if not self.commits:
+            return None
+
+        self.commits.set_options(
+            days=days,
+            prs_limit=prs_limit,
+            commits_per_pr=commits_per_pr,
+            include_files=include_files,
+        )
+
+        return self.commits.get_user_commits()
+
+    def get_user_file_changes(self, days=30, prs_limit=5, commits_per_pr=3, skip_no_diff=True):
+        """Returns a flat list of file changes from PRs with associated commit information."""
+        if not self.commits:
+            return None
+
+        self.commits.set_options(
+            days=days,
+            prs_limit=prs_limit,
+            commits_per_pr=commits_per_pr,
+            include_files=True,
+            skip_no_diff=skip_no_diff,
+        )
+
+        return self.commits.get_user_file_changes()
+
+    def get_issue_comments(
+        self, days=30, issues_limit=10, max_comments=5, include_issue_details=True
+    ):
+        """Fetches comments made by the user on issues across repositories within the specified timeframe."""
+        if not self.comments:
+            return None
+
+        self.comments.set_limits(
+            days=days,
+            issues_limit=issues_limit,
+            max_comments=max_comments,
+            include_issue_details=include_issue_details,
+        )
+
+        return self.comments.get_issue_comments()
+
+    def get_repo_issue_comments(
+        self, repo_name, days=30, issues_limit=10, max_comments=5, include_issue_details=True
+    ):
+        """Fetches comments made by the user on issues in a specific repository within the timeframe."""
+        if not self.user or not self.comments:
+            return None
+
+        self.comments.set_limits(
+            days=days,
+            issues_limit=issues_limit,
+            max_comments=max_comments,
+            include_issue_details=include_issue_details,
+        )
+
+        return self.comments.get_repo_issue_comments(repo_name)
+
+    def _get_user(self, username):
+        """Fetches a GitHub user object."""
+        try:
+            return self.github.get_user(username)
+        except Exception as e:
+            print(f"Error connecting to GitHub API: {e}")
+            return None
+
+    def _extract_user_info(self):
+        """Extracts basic information from a GitHub user object."""
+        return {
+            "login": self.user.login,
+            "name": self.user.name,
+            "bio": self.user.bio,
+            "company": self.user.company,
+            "location": self.user.location,
+            "public_repos": self.user.public_repos,
+            "followers": self.user.followers,
+            "following": self.user.following,
+        }

github_ingest.py (new file)
@@ -0,0 +1,137 @@
+import json
+import asyncio
+import cognee
+from cognee.complex_demos.crewai_demo.src.crewai_demo.github_dev_profile import GitHubDevProfile
+
+
+def get_github_profile_data(
+    username, token=None, days=30, prs_limit=5, commits_per_pr=3, issues_limit=5, max_comments=3
+):
+    """Fetches comprehensive GitHub profile data: user info, commits from PRs, and comments."""
+    token = token or ""
+    profile = GitHubDevProfile(username, token)
+
+    if not profile.user:
+        return None
+
+    commits_result = profile.get_user_commits(
+        days=days, prs_limit=prs_limit, commits_per_pr=commits_per_pr, include_files=True
+    )
+    comments = profile.get_issue_comments(
+        days=days, issues_limit=issues_limit, max_comments=max_comments, include_issue_details=True
+    )
+
+    return {
+        "user": profile.get_user_info(),
+        "commits": commits_result["commits"] if commits_result else [],
+        "comments": comments or [],
+    }
+
+
+def get_github_file_changes(
+    username, token=None, days=30, prs_limit=5, commits_per_pr=3, skip_no_diff=True
+):
+    """Fetches a flat list of file changes from PRs with associated commit information for a GitHub user."""
+    token = token or ""
+    profile = GitHubDevProfile(username, token)
+
+    if not profile.user:
+        return None
+
+    file_changes = profile.get_user_file_changes(
+        days=days, prs_limit=prs_limit, commits_per_pr=commits_per_pr, skip_no_diff=skip_no_diff
+    )
+
+    return {"user": profile.get_user_info(), "file_changes": file_changes or []}
+
+
+def get_github_data_for_cognee(
+    username,
+    token=None,
+    days=30,
+    prs_limit=5,
+    commits_per_pr=3,
+    issues_limit=5,
+    max_comments=3,
+    skip_no_diff=True,
+):
+    """Fetches enriched GitHub data for a user: PR file changes and comments combined with user data."""
+    token = token or ""
+    profile = GitHubDevProfile(username, token)
+
+    if not profile.user:
+        return None
+
+    user_info = profile.get_user_info()
+
+    file_changes = profile.get_user_file_changes(
+        days=days, prs_limit=prs_limit, commits_per_pr=commits_per_pr, skip_no_diff=skip_no_diff
+    )
+
+    enriched_file_changes = []
+    if file_changes:
+        enriched_file_changes = [item | user_info for item in file_changes]
+
+    comments = profile.get_issue_comments(
+        days=days, issues_limit=issues_limit, max_comments=max_comments, include_issue_details=True
+    )
+
+    enriched_comments = []
+    if comments:
+        enriched_comments = [comment | user_info for comment in comments]
+
+    return {"user": user_info, "file_changes": enriched_file_changes, "comments": enriched_comments}
+
+
+async def cognify_github_profile(username, token=None):
+    """Ingests GitHub data into Cognee with soft and technical node sets."""
+    github_data = get_github_data_for_cognee(username=username, token=token)
+    if not github_data:
+        return False
+
+    await cognee.prune.prune_data()
+    await cognee.prune.prune_system(metadata=True)
+
+    await cognee.add(json.dumps(github_data["user"], default=str), node_set=["soft", "technical"])
+
+    for comment in github_data["comments"]:
+        await cognee.add("Comment: " + json.dumps(comment, default=str), node_set=["soft"])
+
+    for file_change in github_data["file_changes"]:
+        await cognee.add(
+            "File Change: " + json.dumps(file_change, default=str), node_set=["technical"]
+        )
+
+    await cognee.cognify()
+    return True
+
+
+async def main(username):
+    """Main function for testing the Cognee ingest."""
+    import os
+    import dotenv
+    from cognee.api.v1.visualize.visualize import visualize_graph
+
+    dotenv.load_dotenv()
+    token = os.getenv("GITHUB_TOKEN")
+
+    await cognify_github_profile(username, token)
+
+    # success = await cognify_github_profile(username, token)
+
+    # if success:
+    #     visualization_path = os.path.join(os.path.dirname(__file__), "./.artifacts/github_graph.html")
+    #     await visualize_graph(visualization_path)
+
+
+if __name__ == "__main__":
+    import os
+    import dotenv
+
+    dotenv.load_dotenv()
+
+    username = ""
+    asyncio.run(main(username))
+    # token = os.getenv("GITHUB_TOKEN")
+    # github_data = get_github_data_for_cognee(username=username, token=token)
+    # print(json.dumps(github_data, indent=2, default=str))
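The enrichment step in `get_github_data_for_cognee` relies on the dict union operator introduced in Python 3.9: `item | user_info` builds a new dict, and on duplicate keys the right-hand operand wins. A small illustration with made-up sample data:

```python
# Python 3.9+ dict union: right-hand operand wins on duplicate keys.
user_info = {"login": "octocat", "followers": 100}
file_changes = [
    {"filename": "a.py", "additions": 3},
    {"filename": "b.py", "additions": 7},
]

# Each file change is enriched with the developer's profile fields,
# so every node ingested into the graph carries author context.
enriched = [item | user_info for item in file_changes]
print(enriched[0])
```

Because `|` returns a fresh dict, the original `file_changes` entries are left untouched; `item |= user_info` would instead mutate them in place.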
@@ -6,8 +6,9 @@ from chromadb import AsyncHttpClient, Settings
 from cognee.exceptions import InvalidValueError
 from cognee.shared.logging_utils import get_logger
 from cognee.modules.storage.utils import get_own_properties
-from cognee.infrastructure.engine.utils import parse_id
 from cognee.infrastructure.engine import DataPoint
+from cognee.infrastructure.engine.utils import parse_id
+from cognee.infrastructure.databases.vector.exceptions import CollectionNotFoundError
 from cognee.infrastructure.databases.vector.models.ScoredResult import ScoredResult
 
 from ..embeddings.EmbeddingEngine import EmbeddingEngine
@@ -108,9 +109,7 @@ class ChromaDBAdapter(VectorDBInterface):
         return await self.embedding_engine.embed_text(data)
 
     async def has_collection(self, collection_name: str) -> bool:
-        client = await self.get_connection()
-        collections = await client.list_collections()
-        # In ChromaDB v0.6.0, list_collections returns collection names directly
+        collections = await self.get_collection_names()
        return collection_name in collections
 
     async def create_collection(self, collection_name: str, payload_schema=None):
@@ -119,13 +118,17 @@ class ChromaDBAdapter(VectorDBInterface):
         if not await self.has_collection(collection_name):
             await client.create_collection(name=collection_name, metadata={"hnsw:space": "cosine"})
 
-    async def create_data_points(self, collection_name: str, data_points: list[DataPoint]):
-        client = await self.get_connection()
-
+    async def get_collection(self, collection_name: str) -> AsyncHttpClient:
         if not await self.has_collection(collection_name):
-            await self.create_collection(collection_name)
+            raise CollectionNotFoundError(f"Collection '{collection_name}' not found!")
 
-        collection = await client.get_collection(collection_name)
+        client = await self.get_connection()
+        return await client.get_collection(collection_name)
+
+    async def create_data_points(self, collection_name: str, data_points: list[DataPoint]):
+        await self.create_collection(collection_name)
+
+        collection = await self.get_collection(collection_name)
 
         texts = [DataPoint.get_embeddable_data(data_point) for data_point in data_points]
         embeddings = await self.embed_data(texts)
@@ -161,8 +164,7 @@ class ChromaDBAdapter(VectorDBInterface):
 
     async def retrieve(self, collection_name: str, data_point_ids: list[str]):
         """Retrieve data points by their IDs from a collection."""
-        client = await self.get_connection()
-        collection = await client.get_collection(collection_name)
+        collection = await self.get_collection(collection_name)
         results = await collection.get(ids=data_point_ids, include=["metadatas"])
 
         return [
@@ -174,62 +176,12 @@ class ChromaDBAdapter(VectorDBInterface):
             for id, metadata in zip(results["ids"], results["metadatas"])
         ]
 
-    async def get_distance_from_collection_elements(
-        self, collection_name: str, query_text: str = None, query_vector: List[float] = None
-    ):
-        """Calculate distance between query and all elements in a collection."""
-        if query_text is None and query_vector is None:
-            raise InvalidValueError(message="One of query_text or query_vector must be provided!")
-
-        if query_text and not query_vector:
-            query_vector = (await self.embedding_engine.embed_text([query_text]))[0]
-
-        client = await self.get_connection()
-        try:
-            collection = await client.get_collection(collection_name)
-
-            collection_count = await collection.count()
-
-            results = await collection.query(
-                query_embeddings=[query_vector],
-                include=["metadatas", "distances"],
-                n_results=collection_count,
-            )
-
-            result_values = []
-            for i, (id, metadata, distance) in enumerate(
-                zip(results["ids"][0], results["metadatas"][0], results["distances"][0])
-            ):
-                result_values.append(
-                    {
-                        "id": parse_id(id),
-                        "payload": restore_data_from_chroma(metadata),
-                        "_distance": distance,
-                    }
-                )
-
-            normalized_values = normalize_distances(result_values)
-
-            scored_results = []
-            for i, result in enumerate(result_values):
-                scored_results.append(
-                    ScoredResult(
-                        id=result["id"],
-                        payload=result["payload"],
-                        score=normalized_values[i],
-                    )
-                )
-
-            return scored_results
-        except Exception:
-            return []
-
     async def search(
         self,
         collection_name: str,
         query_text: str = None,
         query_vector: List[float] = None,
-        limit: int = 5,
+        limit: int = 15,
         with_vector: bool = False,
         normalized: bool = True,
     ):
@@ -241,8 +193,10 @@ class ChromaDBAdapter(VectorDBInterface):
             query_vector = (await self.embedding_engine.embed_text([query_text]))[0]
 
         try:
-            client = await self.get_connection()
-            collection = await client.get_collection(collection_name)
+            collection = await self.get_collection(collection_name)
+
+            if limit == 0:
+                limit = await collection.count()
 
             results = await collection.query(
                 query_embeddings=[query_vector],
@@ -296,8 +250,7 @@ class ChromaDBAdapter(VectorDBInterface):
         """Perform multiple searches in a single request for efficiency."""
         query_vectors = await self.embed_data(query_texts)
 
-        client = await self.get_connection()
-        collection = await client.get_collection(collection_name)
+        collection = await self.get_collection(collection_name)
 
         results = await collection.query(
             query_embeddings=query_vectors,
@@ -346,15 +299,14 @@ class ChromaDBAdapter(VectorDBInterface):
 
     async def delete_data_points(self, collection_name: str, data_point_ids: list[str]):
         """Remove data points from a collection by their IDs."""
-        client = await self.get_connection()
-        collection = await client.get_collection(collection_name)
+        collection = await self.get_collection(collection_name)
         await collection.delete(ids=data_point_ids)
         return True
 
     async def prune(self):
         """Delete all collections in the ChromaDB database."""
         client = await self.get_connection()
-        collections = await client.list_collections()
+        collections = await self.list_collections()
         for collection_name in collections:
             await client.delete_collection(collection_name)
         return True
@@ -362,4 +314,8 @@ class ChromaDBAdapter(VectorDBInterface):
     async def get_collection_names(self):
         """Get a list of all collection names in the database."""
         client = await self.get_connection()
-        return await client.list_collections()
+        collections = await client.list_collections()
+        return [
+            collection.name if hasattr(collection, "name") else collection["name"]
+            for collection in collections
+        ]
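The `get_collection_names` change above smooths over a ChromaDB client difference: depending on the client version, `list_collections` may return collection objects with a `.name` attribute or plain dict-like entries. The normalization can be sketched in isolation (`CollectionStub` is a hypothetical stand-in, not the ChromaDB class):

```python
class CollectionStub:
    """Hypothetical stand-in for a ChromaDB collection object."""

    def __init__(self, name):
        self.name = name


def normalize_collection_names(collections):
    # Accept either objects exposing .name or mappings with a "name" key.
    return [
        collection.name if hasattr(collection, "name") else collection["name"]
        for collection in collections
    ]


mixed = [CollectionStub("docs"), {"name": "chunks"}]
print(normalize_collection_names(mixed))
```

Centralizing this in one method lets `has_collection` and `prune` work with a plain list of names regardless of the client version.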

@@ -1,6 +1,5 @@
-import asyncio
 from typing import Generic, List, Optional, TypeVar, Union, get_args, get_origin, get_type_hints
 
 import lancedb
 from lancedb.pydantic import LanceModel, Vector
 from pydantic import BaseModel
@@ -76,9 +75,14 @@ class LanceDBAdapter(VectorDBInterface):
             exist_ok=True,
         )
 
-    async def create_data_points(self, collection_name: str, data_points: list[DataPoint]):
-        connection = await self.get_connection()
-
+    async def get_collection(self, collection_name: str):
+        if not await self.has_collection(collection_name):
+            raise CollectionNotFoundError(f"Collection '{collection_name}' not found!")
+
+        connection = await self.get_connection()
+        return await connection.open_table(collection_name)
+
+    async def create_data_points(self, collection_name: str, data_points: list[DataPoint]):
         payload_schema = type(data_points[0])
 
         if not await self.has_collection(collection_name):
@@ -87,7 +91,7 @@ class LanceDBAdapter(VectorDBInterface):
                 payload_schema,
             )
 
-        collection = await connection.open_table(collection_name)
+        collection = await self.get_collection(collection_name)
 
         data_vectors = await self.embed_data(
             [DataPoint.get_embeddable_data(data_point) for data_point in data_points]
@@ -125,8 +129,7 @@ class LanceDBAdapter(VectorDBInterface):
         )
 
     async def retrieve(self, collection_name: str, data_point_ids: list[str]):
-        connection = await self.get_connection()
-        collection = await connection.open_table(collection_name)
+        collection = await self.get_collection(collection_name)
 
         if len(data_point_ids) == 1:
             results = await collection.query().where(f"id = '{data_point_ids[0]}'").to_pandas()
@@ -142,48 +145,12 @@ class LanceDBAdapter(VectorDBInterface):
             for result in results.to_dict("index").values()
         ]
 
-    async def get_distance_from_collection_elements(
-        self, collection_name: str, query_text: str = None, query_vector: List[float] = None
-    ):
-        if query_text is None and query_vector is None:
-            raise InvalidValueError(message="One of query_text or query_vector must be provided!")
-
-        if query_text and not query_vector:
-            query_vector = (await self.embedding_engine.embed_text([query_text]))[0]
-
-        connection = await self.get_connection()
-
-        try:
-            collection = await connection.open_table(collection_name)
-
-            collection_size = await collection.count_rows()
-
-            results = (
-                await collection.vector_search(query_vector).limit(collection_size).to_pandas()
-            )
-
-            result_values = list(results.to_dict("index").values())
-
-            normalized_values = normalize_distances(result_values)
-
-            return [
-                ScoredResult(
-                    id=parse_id(result["id"]),
-                    payload=result["payload"],
-                    score=normalized_values[value_index],
-                )
-                for value_index, result in enumerate(result_values)
-            ]
-        except ValueError:
-            # Ignore if collection doesn't exist
-            return []
-
     async def search(
         self,
         collection_name: str,
         query_text: str = None,
         query_vector: List[float] = None,
-        limit: int = 5,
+        limit: int = 15,
         with_vector: bool = False,
         normalized: bool = True,
     ):
@@ -193,12 +160,10 @@ class LanceDBAdapter(VectorDBInterface):
         if query_text and not query_vector:
             query_vector = (await self.embedding_engine.embed_text([query_text]))[0]
 
-        connection = await self.get_connection()
+        collection = await self.get_collection(collection_name)
 
-        try:
-            collection = await connection.open_table(collection_name)
-        except ValueError:
-            raise CollectionNotFoundError(f"Collection '{collection_name}' not found!")
+        if limit == 0:
+            limit = await collection.count_rows()
 
         results = await collection.vector_search(query_vector).limit(limit).to_pandas()
 
@@ -242,8 +207,7 @@ class LanceDBAdapter(VectorDBInterface):
     def delete_data_points(self, collection_name: str, data_point_ids: list[str]):
         @retry(stop=stop_after_attempt(3), wait=wait_exponential(multiplier=1, min=4, max=10))
         async def _delete_data_points():
-            connection = await self.get_connection()
-            collection = await connection.open_table(collection_name)
+            collection = await self.get_collection(collection_name)
 
             # Delete one at a time to avoid commit conflicts
             for data_point_id in data_point_ids:
@@ -288,7 +252,7 @@ class LanceDBAdapter(VectorDBInterface):
         collection_names = await connection.table_names()
 
         for collection_name in collection_names:
-            collection = await connection.open_table(collection_name)
+            collection = await self.get_collection(collection_name)
             await collection.delete("id IS NOT NULL")
             await connection.drop_table(collection_name)
 

@@ -1,11 +1,12 @@
 from __future__ import annotations
 
 import asyncio
-from cognee.shared.logging_utils import get_logger
 from uuid import UUID
 from typing import List, Optional
 
+from cognee.shared.logging_utils import get_logger
 from cognee.infrastructure.engine import DataPoint
 from cognee.infrastructure.engine.utils import parse_id
+from cognee.infrastructure.databases.vector.exceptions import CollectionNotFoundError
 
 from ..embeddings.EmbeddingEngine import EmbeddingEngine
 from ..models.ScoredResult import ScoredResult
@@ -96,7 +97,7 @@ class MilvusAdapter(VectorDBInterface):
             raise e
 
     async def create_data_points(self, collection_name: str, data_points: List[DataPoint]):
-        from pymilvus import MilvusException
+        from pymilvus import MilvusException, exceptions
 
         client = self.get_milvus_client()
         data_vectors = await self.embed_data(
@@ -118,6 +119,10 @@ class MilvusAdapter(VectorDBInterface):
                 f"Inserted {result.get('insert_count', 0)} data points into collection '{collection_name}'."
             )
             return result
+        except exceptions.CollectionNotExistException as error:
+            raise CollectionNotFoundError(
+                f"Collection '{collection_name}' does not exist!"
+            ) from error
         except MilvusException as e:
             logger.error(
                 f"Error inserting data points into collection '{collection_name}': {str(e)}"
@@ -140,8 +145,8 @@ class MilvusAdapter(VectorDBInterface):
         collection_name = f"{index_name}_{index_property_name}"
         await self.create_data_points(collection_name, formatted_data_points)
 
-    async def retrieve(self, collection_name: str, data_point_ids: list[str]):
-        from pymilvus import MilvusException
+    async def retrieve(self, collection_name: str, data_point_ids: list[UUID]):
+        from pymilvus import MilvusException, exceptions
 
         client = self.get_milvus_client()
         try:
@@ -153,6 +158,10 @@ class MilvusAdapter(VectorDBInterface):
                 output_fields=["*"],
             )
             return results
+        except exceptions.CollectionNotExistException as error:
+            raise CollectionNotFoundError(
+                f"Collection '{collection_name}' does not exist!"
+            ) from error
         except MilvusException as e:
             logger.error(
                 f"Error retrieving data points from collection '{collection_name}': {str(e)}"
@@ -164,10 +173,10 @@ class MilvusAdapter(VectorDBInterface):
         collection_name: str,
         query_text: Optional[str] = None,
         query_vector: Optional[List[float]] = None,
-        limit: int = 5,
+        limit: int = 15,
         with_vector: bool = False,
     ):
-        from pymilvus import MilvusException
+        from pymilvus import MilvusException, exceptions
 
         client = self.get_milvus_client()
         if query_text is None and query_vector is None:
@@ -184,7 +193,7 @@ class MilvusAdapter(VectorDBInterface):
             collection_name=collection_name,
             data=[query_vector],
             anns_field="vector",
-            limit=limit,
+            limit=limit if limit > 0 else None,
             output_fields=output_fields,
             search_params={
                 "metric_type": "COSINE",
@@ -199,6 +208,10 @@ class MilvusAdapter(VectorDBInterface):
                 )
                 for result in results[0]
             ]
+        except exceptions.CollectionNotExistException as error:
+            raise CollectionNotFoundError(
+                f"Collection '{collection_name}' does not exist!"
+            ) from error
         except MilvusException as e:
             logger.error(f"Error during search in collection '{collection_name}': {str(e)}")
             raise e
@@ -220,7 +233,7 @@ class MilvusAdapter(VectorDBInterface):
             ]
         )
 
-    async def delete_data_points(self, collection_name: str, data_point_ids: list[str]):
+    async def delete_data_points(self, collection_name: str, data_point_ids: list[UUID]):
        from pymilvus import MilvusException
 
         client = self.get_milvus_client()
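The Milvus hunks above all apply one pattern: the driver-specific `CollectionNotExistException` is translated into the project-level `CollectionNotFoundError`, so callers can handle a single exception type across all vector backends, with the original cause kept on the chain via `raise ... from`. Schematically, with stand-in exception and client classes (this sketch deliberately avoids a pymilvus dependency):

```python
class CollectionNotExistException(Exception):
    """Stand-in for the pymilvus driver exception."""


class CollectionNotFoundError(Exception):
    """Stand-in for cognee's backend-agnostic exception."""


def query_collection(client, collection_name):
    try:
        return client.query(collection_name)
    except CollectionNotExistException as error:
        # Re-raise as the domain-level error; `from error` chains the cause.
        raise CollectionNotFoundError(
            f"Collection '{collection_name}' does not exist!"
        ) from error


class FakeClient:
    """Stand-in client whose query always reports a missing collection."""

    def query(self, name):
        raise CollectionNotExistException(name)


try:
    query_collection(FakeClient(), "missing")
except CollectionNotFoundError as e:
    print(type(e.__cause__).__name__)
```

Chaining with `from error` preserves the driver traceback for debugging while letting retrieval code catch only `CollectionNotFoundError`.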

@@ -7,19 +7,18 @@ from sqlalchemy import JSON, Column, Table, select, delete, MetaData
 from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
 
 from cognee.exceptions import InvalidValueError
 from cognee.infrastructure.databases.exceptions import EntityNotFoundError
+from cognee.infrastructure.databases.vector.exceptions import CollectionNotFoundError
 from cognee.infrastructure.engine import DataPoint
 from cognee.infrastructure.engine.utils import parse_id
 from cognee.infrastructure.databases.relational import get_relational_engine
 
 from ...relational.ModelBase import Base
 from ...relational.sqlalchemy.SqlAlchemyAdapter import SQLAlchemyAdapter
-from ..embeddings.EmbeddingEngine import EmbeddingEngine
-from ..models.ScoredResult import ScoredResult
-from ..vector_db_interface import VectorDBInterface
-from .serialize_data import serialize_data
-from ..exceptions import CollectionNotFoundError
 from ..utils import normalize_distances
+from ..models.ScoredResult import ScoredResult
+from ..vector_db_interface import VectorDBInterface
+from ..embeddings.EmbeddingEngine import EmbeddingEngine
+from .serialize_data import serialize_data
 
 
 class IndexSchema(DataPoint):
@@ -203,60 +202,12 @@ class PGVectorAdapter(SQLAlchemyAdapter, VectorDBInterface):
             for result in results
         ]
 
-    async def get_distance_from_collection_elements(
-        self,
-        collection_name: str,
-        query_text: str = None,
-        query_vector: List[float] = None,
-        with_vector: bool = False,
-    ) -> List[ScoredResult]:
-        if query_text is None and query_vector is None:
-            raise ValueError("One of query_text or query_vector must be provided!")
-
-        if query_text and not query_vector:
-            query_vector = (await self.embedding_engine.embed_text([query_text]))[0]
-
-        try:
-            # Get PGVectorDataPoint Table from database
-            PGVectorDataPoint = await self.get_table(collection_name)
-
-            # Use async session to connect to the database
-            async with self.get_async_session() as session:
-                # Find closest vectors to query_vector
-                closest_items = await session.execute(
-                    select(
-                        PGVectorDataPoint,
-                        PGVectorDataPoint.c.vector.cosine_distance(query_vector).label(
-                            "similarity"
-                        ),
-                    ).order_by("similarity")
-                )
-
-            vector_list = []
-
-            # Extract distances and find min/max for normalization
-            for vector in closest_items:
-                # TODO: Add normalization of similarity score
-                vector_list.append(vector)
-
-            # Create and return ScoredResult objects
-            return [
-                ScoredResult(id=parse_id(str(row.id)), payload=row.payload, score=row.similarity)
-                for row in vector_list
-            ]
-        except EntityNotFoundError:
-            # Ignore if collection does not exist
-            return []
-        except CollectionNotFoundError:
-            # Ignore if collection does not exist
-            return []
-
     async def search(
         self,
         collection_name: str,
         query_text: Optional[str] = None,
         query_vector: Optional[List[float]] = None,
-        limit: int = 5,
+        limit: int = 15,
         with_vector: bool = False,
     ) -> List[ScoredResult]:
         if query_text is None and query_vector is None:
@@ -273,20 +224,21 @@ class PGVectorAdapter(SQLAlchemyAdapter, VectorDBInterface):
 
             # Use async session to connect to the database
             async with self.get_async_session() as session:
+                query = select(
+                    PGVectorDataPoint,
+                    PGVectorDataPoint.c.vector.cosine_distance(query_vector).label("similarity"),
+                ).order_by("similarity")
+
+                if limit > 0:
+                    query = query.limit(limit)
+
                 # Find closest vectors to query_vector
-                closest_items = await session.execute(
-                    select(
-                        PGVectorDataPoint,
-                        PGVectorDataPoint.c.vector.cosine_distance(query_vector).label("similarity"),
-                    )
-                    .order_by("similarity")
-                    .limit(limit)
-                )
+                closest_items = await session.execute(query)
 
             vector_list = []
 
             # Extract distances and find min/max for normalization
-            for vector in closest_items:
+            for vector in closest_items.all():
                 vector_list.append(
                     {
                         "id": parse_id(str(vector.id)),
@@ -295,6 +247,9 @@ class PGVectorAdapter(SQLAlchemyAdapter, VectorDBInterface):
                     }
                 )
 
+            if len(vector_list) == 0:
+                return []
+
             # Normalize vector distance and add this as score information to vector_list
             normalized_values = normalize_distances(vector_list)
             for i in range(0, len(normalized_values)):
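Several adapters pass raw cosine distances through `normalize_distances` before building `ScoredResult`s, which is why the PGVector hunk above short-circuits on an empty `vector_list`: normalizing an empty set would be undefined. The actual implementation lives in `..utils`; a plausible min-max sketch (an assumption for illustration, not the PR's code, including the `_distance` key name):

```python
def normalize_distances(result_values, key="_distance"):
    """Min-max normalize distances into [0, 1]; all-equal distances map to 0."""
    distances = [result[key] for result in result_values]
    min_d, max_d = min(distances), max(distances)
    if max_d == min_d:
        # Avoid division by zero when every distance is identical.
        return [0.0 for _ in distances]
    return [(d - min_d) / (max_d - min_d) for d in distances]


print(normalize_distances([{"_distance": 1.0}, {"_distance": 2.0}, {"_distance": 3.0}]))
```

`min(distances)` raises `ValueError` on an empty list, which is exactly the case the `if len(vector_list) == 0: return []` guard rules out before normalization.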

@@ -1,12 +1,12 @@
-from cognee.shared.logging_utils import get_logger
 from typing import Dict, List, Optional
 
-from cognee.infrastructure.engine.utils import parse_id
 from qdrant_client import AsyncQdrantClient, models
 
+from cognee.shared.logging_utils import get_logger
+from cognee.infrastructure.engine.utils import parse_id
 from cognee.exceptions import InvalidValueError
-from cognee.infrastructure.databases.vector.models.ScoredResult import ScoredResult
 from cognee.infrastructure.engine import DataPoint
+from cognee.infrastructure.databases.vector.exceptions import CollectionNotFoundError
 from cognee.infrastructure.databases.vector.models.ScoredResult import ScoredResult
 
 from ..embeddings.EmbeddingEngine import EmbeddingEngine
 from ..vector_db_interface import VectorDBInterface
@@ -97,6 +97,8 @@ class QDrantAdapter(VectorDBInterface):
         await client.close()
 
     async def create_data_points(self, collection_name: str, data_points: List[DataPoint]):
+        from qdrant_client.http.exceptions import UnexpectedResponse
+
         client = self.get_qdrant_client()
 
         data_vectors = await self.embed_data(
@@ -114,6 +116,13 @@ class QDrantAdapter(VectorDBInterface):
 
         try:
             client.upload_points(collection_name=collection_name, points=points)
+        except UnexpectedResponse as error:
+            if "Collection not found" in str(error):
+                raise CollectionNotFoundError(
+                    message=f"Collection {collection_name} not found!"
+                ) from error
+            else:
+                raise error
         except Exception as error:
             logger.error("Error uploading data points to Qdrant: %s", str(error))
             raise error
@@ -143,19 +152,22 @@ class QDrantAdapter(VectorDBInterface):
         await client.close()
         return results
 
-    async def get_distance_from_collection_elements(
+    async def search(
         self,
         collection_name: str,
-        query_text: str = None,
-        query_vector: List[float] = None,
+        query_text: Optional[str] = None,
+        query_vector: Optional[List[float]] = None,
+        limit: int = 15,
         with_vector: bool = False,
-    ) -> List[ScoredResult]:
-        if query_text is None and query_vector is None:
-            raise ValueError("One of query_text or query_vector must be provided!")
+    ):
+        from qdrant_client.http.exceptions import UnexpectedResponse
 
-        client = self.get_qdrant_client()
+        client = self.get_qdrant_client()
+        if query_text is None and query_vector is None:
+            raise InvalidValueError(message="One of query_text or query_vector must be provided!")
+
+        try:
 
             results = await client.search(
                 collection_name=collection_name,
                 query_vector=models.NamedVector(
@@ -164,9 +176,12 @@ class QDrantAdapter(VectorDBInterface):
                     if query_vector is not None
                     else (await self.embed_data([query_text]))[0],
                 ),
+                limit=limit if limit > 0 else None,
                 with_vectors=with_vector,
             )
 
+            await client.close()
+
             return [
                 ScoredResult(
                     id=parse_id(result.id),
@ -178,51 +193,16 @@ class QDrantAdapter(VectorDBInterface):
|
|||
)
|
||||
for result in results
|
||||
]
|
||||
except ValueError:
|
||||
# Ignore if the collection doesn't exist
|
||||
return []
|
||||
except UnexpectedResponse as error:
|
||||
if "Collection not found" in str(error):
|
||||
raise CollectionNotFoundError(
|
||||
message=f"Collection {collection_name} not found!"
|
||||
) from error
|
||||
else:
|
||||
raise error
|
||||
finally:
|
||||
await client.close()
|
||||
|
||||
async def search(
|
||||
self,
|
||||
collection_name: str,
|
||||
query_text: Optional[str] = None,
|
||||
query_vector: Optional[List[float]] = None,
|
||||
limit: int = 5,
|
||||
with_vector: bool = False,
|
||||
):
|
||||
if query_text is None and query_vector is None:
|
||||
raise InvalidValueError(message="One of query_text or query_vector must be provided!")
|
||||
|
||||
client = self.get_qdrant_client()
|
||||
|
||||
results = await client.search(
|
||||
collection_name=collection_name,
|
||||
query_vector=models.NamedVector(
|
||||
name="text",
|
||||
vector=query_vector
|
||||
if query_vector is not None
|
||||
else (await self.embed_data([query_text]))[0],
|
||||
),
|
||||
limit=limit,
|
||||
with_vectors=with_vector,
|
||||
)
|
||||
|
||||
await client.close()
|
||||
|
||||
return [
|
||||
ScoredResult(
|
||||
id=parse_id(result.id),
|
||||
payload={
|
||||
**result.payload,
|
||||
"id": parse_id(result.id),
|
||||
},
|
||||
score=1 - result.score,
|
||||
)
|
||||
for result in results
|
||||
]
|
||||
|
||||
async def batch_search(
|
||||
self,
|
||||
collection_name: str,
|
||||
|
|
|
|||
|
|
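The adapters in this PR collapse `get_distance_from_collection_elements` and `search` into one `search` entry point that accepts either `query_text` or `query_vector` and treats `limit <= 0` as "no limit". That contract can be sketched with a toy in-memory adapter; everything below (`FakeVectorAdapter`, the distance scoring, the stand-in exception class) is a hypothetical illustration, not cognee's actual implementation:

```python
import asyncio
from typing import List, Optional


class InvalidValueError(ValueError):
    """Stand-in for cognee.exceptions.InvalidValueError."""

    def __init__(self, message: str):
        super().__init__(message)


class FakeVectorAdapter:
    """Toy adapter mirroring the unified search() contract."""

    def __init__(self, vectors):
        self.vectors = vectors  # {text: vector}

    async def embed_data(self, data: List[str]) -> List[List[float]]:
        # Hypothetical "embedding": look up a stored vector, else zeros.
        return [self.vectors.get(text, [0.0, 0.0]) for text in data]

    async def search(
        self,
        collection_name: str,
        query_text: Optional[str] = None,
        query_vector: Optional[List[float]] = None,
        limit: int = 15,
        with_vector: bool = False,
    ):
        # Exactly one entry point is required, as in the real adapters.
        if query_text is None and query_vector is None:
            raise InvalidValueError(message="One of query_text or query_vector must be provided!")

        if query_vector is None:
            query_vector = (await self.embed_data([query_text]))[0]

        # Rank by squared distance; limit <= 0 means "return everything".
        scored = sorted(
            (sum((a - b) ** 2 for a, b in zip(vec, query_vector)), text)
            for text, vec in self.vectors.items()
        )
        return scored if limit <= 0 else scored[:limit]


async def main():
    adapter = FakeVectorAdapter({"alpha": [0.0, 1.0], "beta": [1.0, 0.0]})
    results = await adapter.search("demo_collection", query_text="alpha", limit=1)
    print(results[0][1])  # closest match


asyncio.run(main())
```

Passing `limit=0` is how `CogneeGraph` (later in this diff) fetches distances for every edge type in one call.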
@@ -1,10 +1,10 @@
 import asyncio
-from cognee.shared.logging_utils import get_logger
 from typing import List, Optional

+from cognee.shared.logging_utils import get_logger
 from cognee.exceptions import InvalidValueError
 from cognee.infrastructure.engine import DataPoint
 from cognee.infrastructure.engine.utils import parse_id
+from cognee.infrastructure.databases.vector.exceptions import CollectionNotFoundError

 from ..embeddings.EmbeddingEngine import EmbeddingEngine
 from ..models.ScoredResult import ScoredResult

@@ -34,21 +34,23 @@ class WeaviateAdapter(VectorDBInterface):
         self.embedding_engine = embedding_engine

-        self.client = weaviate.connect_to_wcs(
+        self.client = weaviate.use_async_with_weaviate_cloud(
             cluster_url=url,
             auth_credentials=weaviate.auth.AuthApiKey(api_key),
             additional_config=wvc.init.AdditionalConfig(timeout=wvc.init.Timeout(init=30)),
         )

+    async def get_client(self):
+        await self.client.connect()
+
+        return self.client
+
     async def embed_data(self, data: List[str]) -> List[float]:
         return await self.embedding_engine.embed_text(data)

     async def has_collection(self, collection_name: str) -> bool:
-        future = asyncio.Future()
-
-        future.set_result(self.client.collections.exists(collection_name))
-
-        return await future
+        client = await self.get_client()
+        return await client.collections.exists(collection_name)

     async def create_collection(
         self,

@@ -57,26 +59,25 @@ class WeaviateAdapter(VectorDBInterface):
     ):
         import weaviate.classes.config as wvcc

-        future = asyncio.Future()
-
-        if not self.client.collections.exists(collection_name):
-            future.set_result(
-                self.client.collections.create(
-                    name=collection_name,
-                    properties=[
-                        wvcc.Property(
-                            name="text", data_type=wvcc.DataType.TEXT, skip_vectorization=True
-                        )
-                    ],
-                )
+        if not await self.has_collection(collection_name):
+            client = await self.get_client()
+            return await client.collections.create(
+                name=collection_name,
+                properties=[
+                    wvcc.Property(
+                        name="text", data_type=wvcc.DataType.TEXT, skip_vectorization=True
+                    )
+                ],
             )
         else:
-            future.set_result(self.get_collection(collection_name))
+            return await self.get_collection(collection_name)

-        return await future
+    async def get_collection(self, collection_name: str):
+        if not await self.has_collection(collection_name):
+            raise CollectionNotFoundError(f"Collection '{collection_name}' not found.")

-    def get_collection(self, collection_name: str):
-        return self.client.collections.get(collection_name)
+        client = await self.get_client()
+        return client.collections.get(collection_name)

     async def create_data_points(self, collection_name: str, data_points: List[DataPoint]):
         from weaviate.classes.data import DataObject

@@ -97,29 +98,30 @@ class WeaviateAdapter(VectorDBInterface):
         data_points = [convert_to_weaviate_data_points(data_point) for data_point in data_points]

-        collection = self.get_collection(collection_name)
+        collection = await self.get_collection(collection_name)

         try:
             if len(data_points) > 1:
-                with collection.batch.dynamic() as batch:
-                    for data_point in data_points:
-                        batch.add_object(
-                            uuid=data_point.uuid,
-                            vector=data_point.vector,
-                            properties=data_point.properties,
-                            references=data_point.references,
-                        )
+                return await collection.data.insert_many(data_points)
+                # with collection.batch.dynamic() as batch:
+                #     for data_point in data_points:
+                #         batch.add_object(
+                #             uuid=data_point.uuid,
+                #             vector=data_point.vector,
+                #             properties=data_point.properties,
+                #             references=data_point.references,
+                #         )
             else:
                 data_point: DataObject = data_points[0]
                 if collection.data.exists(data_point.uuid):
-                    return collection.data.update(
+                    return await collection.data.update(
                         uuid=data_point.uuid,
                         vector=data_point.vector,
                         properties=data_point.properties,
                         references=data_point.references,
                     )
                 else:
-                    return collection.data.insert(
+                    return await collection.data.insert(
                         uuid=data_point.uuid,
                         vector=data_point.vector,
                         properties=data_point.properties,

@@ -130,12 +132,12 @@ class WeaviateAdapter(VectorDBInterface):
             raise error

     async def create_vector_index(self, index_name: str, index_property_name: str):
-        await self.create_collection(f"{index_name}_{index_property_name}")
+        return await self.create_collection(f"{index_name}_{index_property_name}")

     async def index_data_points(
         self, index_name: str, index_property_name: str, data_points: list[DataPoint]
     ):
-        await self.create_data_points(
+        return await self.create_data_points(
             f"{index_name}_{index_property_name}",
             [
                 IndexSchema(

@@ -149,9 +151,8 @@ class WeaviateAdapter(VectorDBInterface):
     async def retrieve(self, collection_name: str, data_point_ids: list[str]):
         from weaviate.classes.query import Filter

-        future = asyncio.Future()
-
-        data_points = self.get_collection(collection_name).query.fetch_objects(
+        collection = await self.get_collection(collection_name)
+        data_points = await collection.query.fetch_objects(
             filters=Filter.by_id().contains_any(data_point_ids)
         )

@@ -160,30 +161,32 @@ class WeaviateAdapter(VectorDBInterface):
             data_point.id = data_point.uuid
             del data_point.properties

-        future.set_result(data_points.objects)
+        return data_points.objects

-        return await future
-
-    async def get_distance_from_collection_elements(
+    async def search(
         self,
         collection_name: str,
-        query_text: str = None,
-        query_vector: List[float] = None,
+        query_text: Optional[str] = None,
+        query_vector: Optional[List[float]] = None,
         limit: int = 15,
         with_vector: bool = False,
-    ) -> List[ScoredResult]:
+    ):
         import weaviate.classes as wvc
+        import weaviate.exceptions

         if query_text is None and query_vector is None:
-            raise ValueError("One of query_text or query_vector must be provided!")
+            raise InvalidValueError(message="One of query_text or query_vector must be provided!")

         if query_vector is None:
             query_vector = (await self.embed_data([query_text]))[0]

+        collection = await self.get_collection(collection_name)
+
         try:
-            search_result = self.get_collection(collection_name).query.hybrid(
+            search_result = await collection.query.hybrid(
                 query=None,
                 vector=query_vector,
                 limit=limit if limit > 0 else None,
                 include_vector=with_vector,
                 return_metadata=wvc.query.MetadataQuery(score=True),
             )

@@ -196,43 +199,10 @@ class WeaviateAdapter(VectorDBInterface):
                 )
                 for result in search_result.objects
             ]
-        except weaviate.exceptions.UnexpectedStatusCodeError:
+        except weaviate.exceptions.WeaviateInvalidInputError:
             # Ignore if the collection doesn't exist
             return []

-    async def search(
-        self,
-        collection_name: str,
-        query_text: Optional[str] = None,
-        query_vector: Optional[List[float]] = None,
-        limit: int = None,
-        with_vector: bool = False,
-    ):
-        import weaviate.classes as wvc
-
-        if query_text is None and query_vector is None:
-            raise InvalidValueError(message="One of query_text or query_vector must be provided!")
-
-        if query_vector is None:
-            query_vector = (await self.embed_data([query_text]))[0]
-
-        search_result = self.get_collection(collection_name).query.hybrid(
-            query=None,
-            vector=query_vector,
-            limit=limit,
-            include_vector=with_vector,
-            return_metadata=wvc.query.MetadataQuery(score=True),
-        )
-
-        return [
-            ScoredResult(
-                id=parse_id(str(result.uuid)),
-                payload=result.properties,
-                score=1 - float(result.metadata.score),
-            )
-            for result in search_result.objects
-        ]
-
     async def batch_search(
         self, collection_name: str, query_texts: List[str], limit: int, with_vectors: bool = False
     ):

@@ -248,14 +218,13 @@ class WeaviateAdapter(VectorDBInterface):
     async def delete_data_points(self, collection_name: str, data_point_ids: list[str]):
         from weaviate.classes.query import Filter

-        future = asyncio.Future()
-
-        result = self.get_collection(collection_name).data.delete_many(
+        collection = await self.get_collection(collection_name)
+        result = await collection.data.delete_many(
             filters=Filter.by_id().contains_any(data_point_ids)
         )
-        future.set_result(result)

-        return await future
+        return result

     async def prune(self):
-        self.client.collections.delete_all()
+        client = await self.get_client()
+        await client.collections.delete_all()
@@ -1,9 +1,10 @@
-from typing import Type, Optional
-from pydantic import BaseModel
-from cognee.shared.logging_utils import get_logger
 import litellm
+from pydantic import BaseModel
+from typing import Type, Optional
 from litellm import acompletion, JSONSchemaValidationError
-from cognee.shared.data_models import MonitoringTool

+from cognee.shared.logging_utils import get_logger
+from cognee.modules.observability.get_observe import get_observe
 from cognee.exceptions import InvalidValueError
 from cognee.infrastructure.llm.llm_interface import LLMInterface
 from cognee.infrastructure.llm.prompts import read_query_prompt

@@ -11,14 +12,9 @@ from cognee.infrastructure.llm.rate_limiter import (
     rate_limit_async,
     sleep_and_retry_async,
 )
-from cognee.base_config import get_base_config

 logger = get_logger()

-monitoring = get_base_config().monitoring_tool
-
-if monitoring == MonitoringTool.LANGFUSE:
-    from langfuse.decorators import observe
+observe = get_observe()


 class GeminiAdapter(LLMInterface):
@@ -1,14 +1,11 @@
 import os
 import base64
 from pathlib import Path
-from typing import Type

 import litellm
 import instructor
+from typing import Type
 from pydantic import BaseModel

 from cognee.modules.data.processing.document_types.open_data_file import open_data_file
-from cognee.shared.data_models import MonitoringTool
 from cognee.exceptions import InvalidValueError
 from cognee.infrastructure.llm.llm_interface import LLMInterface
 from cognee.infrastructure.llm.prompts import read_query_prompt

@@ -18,12 +15,9 @@ from cognee.infrastructure.llm.rate_limiter import (
     sleep_and_retry_async,
     sleep_and_retry_sync,
 )
-from cognee.base_config import get_base_config
+from cognee.modules.observability.get_observe import get_observe

-monitoring = get_base_config().monitoring_tool
-
-if monitoring == MonitoringTool.LANGFUSE:
-    from langfuse.decorators import observe
+observe = get_observe()


 class OpenAIAdapter(LLMInterface):
@@ -133,8 +133,10 @@ class CogneeGraph(CogneeAbstractGraph):
         if query_vector is None or len(query_vector) == 0:
             raise ValueError("Failed to generate query embedding.")

-        edge_distances = await vector_engine.get_distance_from_collection_elements(
-            "EdgeType_relationship_name", query_text=query
+        edge_distances = await vector_engine.search(
+            collection_name="EdgeType_relationship_name",
+            query_text=query,
+            limit=0,
         )

         embedding_map = {result.payload["text"]: result.score for result in edge_distances}
11  cognee/modules/observability/get_observe.py  (new file)

@@ -0,0 +1,11 @@
+from cognee.base_config import get_base_config
+from .observers import Observer
+
+
+def get_observe():
+    monitoring = get_base_config().monitoring_tool
+
+    if monitoring == Observer.LANGFUSE:
+        from langfuse.decorators import observe
+
+        return observe
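The new helper centralizes the "pick a tracing decorator based on configuration" pattern that was previously duplicated in each LLM adapter. A self-contained sketch of that pattern is below; note the config lookup is replaced by a parameter and a no-op fallback decorator is added so the sketch runs without cognee or langfuse installed (both of those are assumptions, not what the PR's `get_observe` does):

```python
from enum import Enum


class Observer(str, Enum):
    """Mirrors cognee/modules/observability/observers.py."""

    LANGFUSE = "langfuse"
    LLMLITE = "llmlite"
    LANGSMITH = "langsmith"


def get_observe(monitoring: Observer):
    """Return a tracing decorator for the configured monitoring backend.

    The real helper reads the tool from get_base_config(); here it is a
    parameter so the sketch is standalone. The no-op fallback for other
    backends is an illustrative assumption.
    """
    if monitoring == Observer.LANGFUSE:
        try:
            from langfuse.decorators import observe

            return observe
        except ImportError:
            pass  # langfuse not installed; fall through to the no-op

    def observe(func):  # no-op decorator fallback
        return func

    return observe


observe = get_observe(Observer.LANGSMITH)


@observe
def answer(question: str) -> str:
    return f"echo: {question}"


print(answer("hi"))
```

Because the decorator is resolved once at import time, adapters can write `observe = get_observe()` at module level and decorate methods unconditionally.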
9  cognee/modules/observability/observers.py  (new file)

@@ -0,0 +1,9 @@
+from enum import Enum
+
+
+class Observer(str, Enum):
+    """Monitoring tools"""
+
+    LANGFUSE = "langfuse"
+    LLMLITE = "llmlite"
+    LANGSMITH = "langsmith"
@@ -20,7 +20,9 @@ from ..tasks.task import Task
 logger = get_logger("run_tasks(tasks: [Task], data)")


-async def run_tasks_with_telemetry(tasks: list[Task], data, user: User, pipeline_name: str):
+async def run_tasks_with_telemetry(
+    tasks: list[Task], data, user: User, pipeline_name: str, context: dict = None
+):
     config = get_current_settings()

     logger.debug("\nRunning pipeline with configuration:\n%s\n", json.dumps(config, indent=1))

@@ -36,7 +38,7 @@ async def run_tasks_with_telemetry(tasks: list[Task], data, user: User, pipeline
         | config,
     )

-    async for result in run_tasks_base(tasks, data, user):
+    async for result in run_tasks_base(tasks, data, user, context):
         yield result

     logger.info("Pipeline run completed: `%s`", pipeline_name)

@@ -72,6 +74,7 @@ async def run_tasks(
     data: Any = None,
     user: User = None,
     pipeline_name: str = "unknown_pipeline",
+    context: dict = None,
 ):
     pipeline_id = uuid5(NAMESPACE_OID, pipeline_name)

@@ -82,7 +85,11 @@ async def run_tasks(
     try:
         async for _ in run_tasks_with_telemetry(
-            tasks=tasks, data=data, user=user, pipeline_name=pipeline_id
+            tasks=tasks,
+            data=data,
+            user=user,
+            pipeline_name=pipeline_id,
+            context=context,
         ):
             pass
@@ -14,6 +14,7 @@ async def handle_task(
     leftover_tasks: list[Task],
     next_task_batch_size: int,
     user: User,
+    context: dict = None,
 ):
     """Handle common task workflow with logging, telemetry, and error handling around the core execution logic."""
     task_type = running_task.task_type

@@ -27,9 +28,16 @@ async def handle_task(
         },
     )

+    has_context = any(
+        [key == "context" for key in inspect.signature(running_task.executable).parameters.keys()]
+    )
+
+    if has_context:
+        args.append(context)
+
     try:
         async for result_data in running_task.execute(args, next_task_batch_size):
-            async for result in run_tasks_base(leftover_tasks, result_data, user):
+            async for result in run_tasks_base(leftover_tasks, result_data, user, context):
                 yield result

         logger.info(f"{task_type} task completed: `{running_task.executable.__name__}`")

@@ -55,7 +63,7 @@ async def handle_task(
         raise error


-async def run_tasks_base(tasks: list[Task], data=None, user: User = None):
+async def run_tasks_base(tasks: list[Task], data=None, user: User = None, context: dict = None):
     """Base function to execute tasks in a pipeline, handling task type detection and execution."""
     if len(tasks) == 0:
         yield data

@@ -68,5 +76,7 @@ async def run_tasks_base(tasks: list[Task], data=None, user: User = None):
     next_task = leftover_tasks[0] if len(leftover_tasks) > 0 else None
     next_task_batch_size = next_task.task_config["batch_size"] if next_task else 1

-    async for result in handle_task(running_task, args, leftover_tasks, next_task_batch_size, user):
+    async for result in handle_task(
+        running_task, args, leftover_tasks, next_task_batch_size, user, context
+    ):
         yield result
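The context-injection rule added above is simple: a task receives the pipeline `context` only if its function signature declares a `context` parameter, which `handle_task` detects via `inspect.signature`. The check can be isolated in a standalone sketch (the helper name and the plain synchronous call are illustrative simplifications of the async pipeline):

```python
import inspect


def call_with_optional_context(task, args: list, context=None):
    """Append `context` to the argument list only when the task's
    signature declares a `context` parameter -- the same check
    handle_task() performs with inspect.signature()."""
    has_context = any(
        key == "context" for key in inspect.signature(task).parameters.keys()
    )

    call_args = list(args)
    if has_context:
        call_args.append(context)

    return task(*call_args)


def task_plain(num):
    return num * 2


def task_with_context(num, context):
    return num + context


print(call_with_optional_context(task_plain, [5], context=7))         # 10
print(call_with_optional_context(task_with_context, [5], context=7))  # 12
```

This is why, in the new integration test later in this diff, `task_1` and `task_3` receive the context value while `task_2` does not.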
@@ -2,16 +2,6 @@ from fastapi import status
 from cognee.exceptions import CogneeApiError, CriticalError


-class CollectionDistancesNotFoundError(CogneeApiError):
-    def __init__(
-        self,
-        message: str = "No distances found between the query and collections. It is possible that the given collection names don't exist.",
-        name: str = "CollectionDistancesNotFoundError",
-        status_code: int = status.HTTP_404_NOT_FOUND,
-    ):
-        super().__init__(message, name, status_code)
-
-
 class SearchTypeNotSupported(CogneeApiError):
     def __init__(
         self,

@@ -34,3 +24,13 @@ class CypherSearchError(CogneeApiError):

 class NoDataError(CriticalError):
     message: str = "No data found in the system, please add data first."
+
+
+class CollectionDistancesNotFoundError(CogneeApiError):
+    def __init__(
+        self,
+        message: str = "No collection distances found for the given query.",
+        name: str = "CollectionDistancesNotFoundError",
+        status_code: int = status.HTTP_404_NOT_FOUND,
+    ):
+        super().__init__(message, name, status_code)
@@ -3,7 +3,6 @@ from collections import Counter
 import string

 from cognee.infrastructure.engine import DataPoint
-from cognee.modules.graph.exceptions.exceptions import EntityNotFoundError
 from cognee.modules.graph.utils.convert_node_to_data_point import get_all_subclasses
 from cognee.modules.retrieval.base_retriever import BaseRetriever
 from cognee.modules.retrieval.utils.brute_force_triplet_search import brute_force_triplet_search

@@ -87,10 +86,7 @@ class GraphCompletionRetriever(BaseRetriever):

     async def get_context(self, query: str) -> str:
         """Retrieves and resolves graph triplets into context."""
-        try:
-            triplets = await self.get_triplets(query)
-        except EntityNotFoundError:
-            return ""
+        triplets = await self.get_triplets(query)

         if len(triplets) == 0:
             logger.warning("Empty context was provided to the completion")
@@ -350,11 +350,3 @@ class ChunkSummaries(BaseModel):
     """Relevant summary and chunk id"""

     summaries: List[ChunkSummary]
-
-
-class MonitoringTool(str, Enum):
-    """Monitoring tools"""
-
-    LANGFUSE = "langfuse"
-    LLMLITE = "llmlite"
-    LANGSMITH = "langsmith"
@@ -312,7 +312,7 @@ def setup_logging(log_level=None, name=None):
     root_logger.addHandler(file_handler)
     root_logger.setLevel(log_level)

-    if log_level > logging.WARNING:
+    if log_level > logging.DEBUG:
         import warnings
         from sqlalchemy.exc import SAWarning
@@ -1,11 +0,0 @@
-import os
-
-import pytest
-
-
-@pytest.fixture(autouse=True, scope="session")
-def copy_cognee_db_to_target_location():
-    os.makedirs("cognee/.cognee_system/databases/", exist_ok=True)
-    os.system(
-        "cp cognee/tests/integration/run_toy_tasks/data/cognee_db cognee/.cognee_system/databases/cognee_db"
-    )
Binary file not shown.
@@ -82,7 +82,7 @@ async def main():
     assert not os.path.isdir(data_directory_path), "Local data files are not deleted"

     await cognee.prune.prune_system(metadata=True)
-    collections = get_vector_engine().client.collections.list_all()
+    collections = await get_vector_engine().client.collections.list_all()
     assert len(collections) == 0, "Weaviate vector database is not empty"
@@ -48,3 +48,7 @@ async def run_and_check_tasks():

 def test_run_tasks():
     asyncio.run(run_and_check_tasks())
+
+
+if __name__ == "__main__":
+    test_run_tasks()
@@ -0,0 +1,47 @@
+import asyncio
+
+import cognee
+from cognee.modules.pipelines.tasks.task import Task
+from cognee.modules.users.methods import get_default_user
+from cognee.modules.pipelines.operations.run_tasks import run_tasks_base
+from cognee.infrastructure.databases.relational import create_db_and_tables
+
+
+async def run_and_check_tasks():
+    await cognee.prune.prune_data()
+    await cognee.prune.prune_system(metadata=True)
+
+    def task_1(num, context):
+        return num + context
+
+    def task_2(num):
+        return num * 2
+
+    def task_3(num, context):
+        return num**context
+
+    await create_db_and_tables()
+    user = await get_default_user()
+
+    pipeline = run_tasks_base(
+        [
+            Task(task_1),
+            Task(task_2),
+            Task(task_3),
+        ],
+        data=5,
+        user=user,
+        context=7,
+    )
+
+    final_result = 4586471424
+    async for result in pipeline:
+        assert result == final_result
+
+
+def test_run_tasks():
+    asyncio.run(run_and_check_tasks())
+
+
+if __name__ == "__main__":
+    test_run_tasks()
@@ -16,11 +16,11 @@ class TestChunksRetriever:
     @pytest.mark.asyncio
     async def test_chunk_context_simple(self):
         system_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".cognee_system/test_rag_context"
+            pathlib.Path(__file__).parent, ".cognee_system/test_chunks_context_simple"
         )
         cognee.config.system_root_directory(system_directory_path)
         data_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".data_storage/test_rag_context"
+            pathlib.Path(__file__).parent, ".data_storage/test_chunks_context_simple"
         )
         cognee.config.data_root_directory(data_directory_path)

@@ -73,11 +73,11 @@ class TestChunksRetriever:
     @pytest.mark.asyncio
     async def test_chunk_context_complex(self):
         system_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".cognee_system/test_chunk_context"
+            pathlib.Path(__file__).parent, ".cognee_system/test_chunk_context_complex"
        )
         cognee.config.system_root_directory(system_directory_path)
         data_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".data_storage/test_chunk_context"
+            pathlib.Path(__file__).parent, ".data_storage/test_chunk_context_complex"
         )
         cognee.config.data_root_directory(data_directory_path)

@@ -162,11 +162,11 @@ class TestChunksRetriever:
     @pytest.mark.asyncio
     async def test_chunk_context_on_empty_graph(self):
         system_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".cognee_system/test_chunk_context"
+            pathlib.Path(__file__).parent, ".cognee_system/test_chunk_context_empty"
         )
         cognee.config.system_root_directory(system_directory_path)
         data_directory_path = os.path.join(
-            pathlib.Path(__file__).parent, ".data_storage/test_chunk_context"
+            pathlib.Path(__file__).parent, ".data_storage/test_chunk_context_empty"
         )
         cognee.config.data_root_directory(data_directory_path)

@@ -190,6 +190,9 @@ if __name__ == "__main__":

     test = TestChunksRetriever()

-    run(test.test_chunk_context_simple())
-    run(test.test_chunk_context_complex())
-    run(test.test_chunk_context_on_empty_graph())
+    async def main():
+        await test.test_chunk_context_simple()
+        await test.test_chunk_context_complex()
+        await test.test_chunk_context_on_empty_graph()
+
+    run(main())
@@ -154,6 +154,9 @@ if __name__ == "__main__":

     test = TestGraphCompletionRetriever()

-    run(test.test_graph_completion_context_simple())
-    run(test.test_graph_completion_context_complex())
-    run(test.test_get_graph_completion_context_on_empty_graph())
+    async def main():
+        await test.test_graph_completion_context_simple()
+        await test.test_graph_completion_context_complex()
+        await test.test_get_graph_completion_context_on_empty_graph()
+
+    run(main())
@@ -127,7 +127,7 @@ class TextSummariesRetriever:

     await add_data_points(entities)

-    retriever = SummariesRetriever(limit=20)
+    retriever = SummariesRetriever(top_k=20)

     context = await retriever.get_context("Christina")
@@ -1,44 +0,0 @@
-import pytest
-from unittest.mock import AsyncMock, patch
-from cognee.modules.users.models import User
-from cognee.modules.retrieval.exceptions import CollectionDistancesNotFoundError
-from cognee.modules.retrieval.utils.brute_force_triplet_search import (
-    brute_force_search,
-    brute_force_triplet_search,
-)
-
-
-@pytest.mark.asyncio
-@patch("cognee.modules.retrieval.utils.brute_force_triplet_search.get_vector_engine")
-async def test_brute_force_search_collection_not_found(mock_get_vector_engine):
-    user = User(id="test_user")
-    query = "test query"
-    collections = ["nonexistent_collection"]
-    top_k = 5
-    mock_memory_fragment = AsyncMock()
-    mock_vector_engine = AsyncMock()
-    mock_vector_engine.get_distance_from_collection_elements.return_value = []
-    mock_get_vector_engine.return_value = mock_vector_engine
-
-    with pytest.raises(CollectionDistancesNotFoundError):
-        await brute_force_search(
-            query, user, top_k, collections=collections, memory_fragment=mock_memory_fragment
-        )
-
-
-@pytest.mark.asyncio
-@patch("cognee.modules.retrieval.utils.brute_force_triplet_search.get_vector_engine")
-async def test_brute_force_triplet_search_collection_not_found(mock_get_vector_engine):
-    user = User(id="test_user")
-    query = "test query"
-    collections = ["nonexistent_collection"]
-    top_k = 5
-    mock_memory_fragment = AsyncMock()
-    mock_vector_engine = AsyncMock()
-    mock_vector_engine.get_distance_from_collection_elements.return_value = []
-    mock_get_vector_engine.return_value = mock_vector_engine
-
-    with pytest.raises(CollectionDistancesNotFoundError):
-        await brute_force_triplet_search(
-            query, user, top_k, collections=collections, memory_fragment=mock_memory_fragment
-        )
162
community/README.zh.md
Normal file
162
community/README.zh.md
Normal file
|
|
@ -0,0 +1,162 @@
|
|||
<div align="center">
|
||||
<a href="https://github.com/topoteretes/cognee">
|
||||
<img src="https://raw.githubusercontent.com/topoteretes/cognee/refs/heads/dev/assets/cognee-logo-transparent.png" alt="Cognee Logo" height="60">
|
||||
</a>
|
||||
|
||||
<br />
|
||||
|
||||
cognee - AI应用和智能体的记忆层
|
||||
|
||||
<p align="center">
|
||||
<a href="https://www.youtube.com/watch?v=1bezuvLwJmw&t=2s">演示</a>
|
||||
.
|
||||
<a href="https://cognee.ai">了解更多</a>
|
||||
·
|
||||
<a href="https://discord.gg/NQPKmU5CCg">加入Discord</a>
|
||||
</p>
|
||||
|
||||
|
||||
[](https://GitHub.com/topoteretes/cognee/network/)
|
||||
[](https://GitHub.com/topoteretes/cognee/stargazers/)
|
||||
[](https://GitHub.com/topoteretes/cognee/commit/)
|
||||
[](https://github.com/topoteretes/cognee/tags/)
|
||||
[](https://pepy.tech/project/cognee)
|
||||
[](https://github.com/topoteretes/cognee/blob/main/LICENSE)
|
||||
[](https://github.com/topoteretes/cognee/graphs/contributors)
|
||||
|
||||
可靠的AI智能体响应。
|
||||
|
||||
|
||||
|
||||
使用可扩展、模块化的ECL(提取、认知、加载)管道构建动态智能体记忆。
|
||||
|
||||
更多[使用场景](https://docs.cognee.ai/use_cases)。
|
||||
|
||||
<div style="text-align: center">
|
||||
<img src="cognee_benefits_zh.JPG" alt="为什么选择cognee?" width="100%" />
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
|
||||
|
||||
|
||||
## 功能特性
|
||||
|
||||
- 互联并检索您的历史对话、文档、图像和音频转录
|
||||
- 减少幻觉、开发人员工作量和成本
|
||||
- 仅使用Pydantic将数据加载到图形和向量数据库
|
||||
- 从30多个数据源摄取数据时进行数据操作
|
||||
|
||||
## 开始使用
|
||||
|
||||
通过Google Colab <a href="https://colab.research.google.com/drive/1g-Qnx6l_ecHZi0IOw23rg0qC4TYvEvWZ?usp=sharing">笔记本</a>或<a href="https://github.com/topoteretes/cognee-starter">入门项目</a>快速上手
|
||||
|
||||
## 贡献
|
||||
您的贡献是使这成为真正开源项目的核心。我们**非常感谢**任何贡献。更多信息请参阅[`CONTRIBUTING.md`](CONTRIBUTING.md)。
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
## 📦 安装
|
||||
|
||||
您可以使用**pip**、**poetry**、**uv**或任何其他Python包管理器安装Cognee。
|
||||
|
||||
### 使用pip
|
||||
|
||||
```bash
|
||||
pip install cognee
|
||||
```
|
||||
|
||||
## 💻 基本用法
|
||||
|
||||
### 设置
|
||||
|
||||
```
|
||||
import os
|
||||
os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
|
||||
|
||||
```
|
||||
|
||||
您也可以通过创建.env文件设置变量,使用我们的<a href="https://github.com/topoteretes/cognee/blob/main/.env.template">模板</a>。
|
||||
要使用不同的LLM提供商,请查看我们的<a href="https://docs.cognee.ai">文档</a>获取更多信息。
|
||||
|
||||
|
||||
### 简单示例
|
||||
|
||||
此脚本将运行默认管道:
|
||||
|
||||
```python
|
||||
import cognee
|
||||
import asyncio
|
||||
|
||||
|
||||
async def main():
|
||||
# Add text to cognee
|
||||
await cognee.add("自然语言处理(NLP)是计算机科学和信息检索的跨学科领域。")
|
||||
|
||||
# Generate the knowledge graph
|
||||
await cognee.cognify()
|
||||
|
||||
# Query the knowledge graph
|
||||
results = await cognee.search("告诉我关于NLP")
|
||||
|
||||
# Display the results
|
||||
for result in results:
|
||||
print(result)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
asyncio.run(main())
|
||||
|
||||
```
|
||||
示例输出:
|
||||
```
|
||||
自然语言处理(NLP)是计算机科学和信息检索的跨学科领域。它关注计算机和人类语言之间的交互,使机器能够理解和处理自然语言。
|
||||
|
||||
```
|
||||
Graph visualization:

<a href="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/main/assets/graph_visualization.html"><img src="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/main/assets/graph_visualization.png" width="100%" alt="Graph visualization"></a>

Open it in a [browser](https://rawcdn.githack.com/topoteretes/cognee/refs/heads/main/assets/graph_visualization.html).

For more advanced usage, check our <a href="https://docs.cognee.ai">documentation</a>.
## Understand our architecture

<div style="text-align: center">
  <img src="cognee_diagram_zh.JPG" alt="cognee concept diagram" width="100%" />
</div>
## Demos

1. What is AI memory:

[Learn about cognee](https://github.com/user-attachments/assets/8b2a0050-5ec4-424c-b417-8269971503f0)

2. A simple GraphRAG demo

[Simple GraphRAG demo](https://github.com/user-attachments/assets/f57fd9ea-1dc0-4904-86eb-de78519fdc32)

3. cognee with Ollama

[cognee with local models](https://github.com/user-attachments/assets/834baf9a-c371-4ecf-92dd-e144bd0eb3f6)
## Code of Conduct

We are committed to making open source an enjoyable and respectful experience for our community. For more information, see <a href="https://github.com/topoteretes/cognee/blob/main/CODE_OF_CONDUCT.md"><code>CODE_OF_CONDUCT</code></a>.
## 💫 Contributors

<a href="https://github.com/topoteretes/cognee/graphs/contributors">
  <img alt="contributors" src="https://contrib.rocks/image?repo=topoteretes/cognee"/>
</a>
## Star History

[](https://star-history.com/#topoteretes/cognee&Date)
BIN  community/cognee_benefits_zh.JPG  Normal file (binary file not shown; Size: 262 KiB)
BIN  community/cognee_diagram_zh.JPG  Normal file (binary file not shown; Size: 181 KiB)
37  examples/data/car_and_tech_companies.txt  Normal file

@@ -0,0 +1,37 @@
text_1 = """
|
||||
1. Audi
|
||||
Audi is known for its modern designs and advanced technology. Founded in the early 1900s, the brand has earned a reputation for precision engineering and innovation. With features like the Quattro all-wheel-drive system, Audi offers a range of vehicles from stylish sedans to high-performance sports cars.
|
||||
|
||||
2. BMW
|
||||
BMW, short for Bayerische Motoren Werke, is celebrated for its focus on performance and driving pleasure. The company's vehicles are designed to provide a dynamic and engaging driving experience, and their slogan, "The Ultimate Driving Machine," reflects that commitment. BMW produces a variety of cars that combine luxury with sporty performance.
|
||||
|
||||
3. Mercedes-Benz
|
||||
Mercedes-Benz is synonymous with luxury and quality. With a history dating back to the early 20th century, the brand is known for its elegant designs, innovative safety features, and high-quality engineering. Mercedes-Benz manufactures not only luxury sedans but also SUVs, sports cars, and commercial vehicles, catering to a wide range of needs.
|
||||
|
||||
4. Porsche
|
||||
Porsche is a name that stands for high-performance sports cars. Founded in 1931, the brand has become famous for models like the iconic Porsche 911. Porsche cars are celebrated for their speed, precision, and distinctive design, appealing to car enthusiasts who value both performance and style.
|
||||
|
||||
5. Volkswagen
|
||||
Volkswagen, which means "people's car" in German, was established with the idea of making affordable and reliable vehicles accessible to everyone. Over the years, Volkswagen has produced several iconic models, such as the Beetle and the Golf. Today, it remains one of the largest car manufacturers in the world, offering a wide range of vehicles that balance practicality with quality.
|
||||
|
||||
Each of these car manufacturer contributes to Germany's reputation as a leader in the global automotive industry, showcasing a blend of innovation, performance, and design excellence.
|
||||
"""
|
||||
|
||||
text_2 = """
|
||||
1. Apple
|
||||
Apple is renowned for its innovative consumer electronics and software. Its product lineup includes the iPhone, iPad, Mac computers, and wearables like the Apple Watch. Known for its emphasis on sleek design and user-friendly interfaces, Apple has built a loyal customer base and created a seamless ecosystem that integrates hardware, software, and services.
|
||||
|
||||
2. Google
|
||||
Founded in 1998, Google started as a search engine and quickly became the go-to resource for finding information online. Over the years, the company has diversified its offerings to include digital advertising, cloud computing, mobile operating systems (Android), and various web services like Gmail and Google Maps. Google's innovations have played a major role in shaping the internet landscape.
|
||||
|
||||
3. Microsoft
|
||||
Microsoft Corporation has been a dominant force in software for decades. Its Windows operating system and Microsoft Office suite are staples in both business and personal computing. In recent years, Microsoft has expanded into cloud computing with Azure, gaming with the Xbox platform, and even hardware through products like the Surface line. This evolution has helped the company maintain its relevance in a rapidly changing tech world.
|
||||
|
||||
4. Amazon
|
||||
What began as an online bookstore has grown into one of the largest e-commerce platforms globally. Amazon is known for its vast online marketplace, but its influence extends far beyond retail. With Amazon Web Services (AWS), the company has become a leader in cloud computing, offering robust solutions that power websites, applications, and businesses around the world. Amazon's constant drive for innovation continues to reshape both retail and technology sectors.
|
||||
|
||||
5. Meta
|
||||
Meta, originally known as Facebook, revolutionized social media by connecting billions of people worldwide. Beyond its core social networking service, Meta is investing in the next generation of digital experiences through virtual and augmented reality technologies, with projects like Oculus. The company's efforts signal a commitment to evolving digital interaction and building the metaverse—a shared virtual space where users can connect and collaborate.
|
||||
|
||||
Each of these companies has significantly impacted the technology landscape, driving innovation and transforming everyday life through their groundbreaking products and services.
|
||||
"""
|
||||
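The data file above stores its two corpora as Python-style triple-quoted assignments. A minimal sketch of splitting them back out before ingestion (assuming exactly that `name = """..."""` layout; the inline sample stands in for reading `examples/data/car_and_tech_companies.txt`):

```python
import re

# Stand-in for open("examples/data/car_and_tech_companies.txt").read()
raw = '''text_1 = """
1. Audi
Audi is known for its modern designs and advanced technology.
"""

text_2 = """
1. Apple
Apple is renowned for its innovative consumer electronics and software.
"""'''

# Capture each `name = """body"""` pair; DOTALL lets . span newlines
blocks = {name: body.strip() for name, body in re.findall(r'(\w+)\s*=\s*"""(.*?)"""', raw, re.DOTALL)}
print(sorted(blocks))                    # ['text_1', 'text_2']
print(blocks["text_1"].split("\n")[0])   # 1. Audi
```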
474  poetry.lock  generated
@@ -857,14 +857,14 @@ files = [

[[package]]
name = "certifi"
version = "2025.1.31"
version = "2025.4.26"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
groups = ["main"]
files = [
    {file = "certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe"},
    {file = "certifi-2025.1.31.tar.gz", hash = "sha256:3d5da6925056f6f18f119200434a4780a94263f10d1c21d032a6f6b2baa20651"},
    {file = "certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3"},
    {file = "certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6"},
]

[[package]]

@@ -1225,7 +1225,7 @@ description = "Cross-platform colored terminal text."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
groups = ["main"]
markers = "(sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\") and (platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\" or extra == \"codegraph\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\" or extra == \"chromadb\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\" or extra == \"chromadb\" or extra == \"codegraph\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\") and (python_version < \"3.13\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\")"
markers = "(sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\") and (platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\" or extra == \"codegraph\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\" or extra == \"chromadb\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\") and (sys_platform == \"win32\" or platform_system == \"Windows\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"dev\" or extra == \"chromadb\" or extra == \"codegraph\") and (python_version < \"3.13\" or platform_system == \"Windows\" or extra == \"notebook\" or extra == \"dev\" or extra == \"llama-index\" or extra == \"deepeval\" or extra == \"chromadb\")"
files = [
    {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
    {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},

@@ -1238,7 +1238,7 @@ description = "Colored terminal output for Python's logging module"
optional = true
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
groups = ["main"]
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
files = [
    {file = "coloredlogs-15.0.1-py2.py3-none-any.whl", hash = "sha256:612ee75c546f53e92e70049c9dbfcc18c935a2b9a53b66085ce9ef6a6e5c0934"},
    {file = "coloredlogs-15.0.1.tar.gz", hash = "sha256:7c991aa71a4577af2f82600d8f8f3a89f936baeaf9b50a9c197da014e5bf16b0"},
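The long `markers` strings in these hunks are PEP 508 environment markers: the resolver evaluates them against the target interpreter's environment to decide whether a package is installed at all. A simplified sketch of that evaluation (real resolvers use the `packaging` library and the full PEP 508 grammar; this toy version handles only flat `name == "value"` clauses joined by `and`/`or`, with no parentheses or `<`/`>=` comparisons):

```python
def eval_marker(expr: str, env: dict) -> bool:
    """Toy PEP 508-style marker evaluation: 'or' binds looser than 'and'."""
    def clause(c: str) -> bool:
        name, _, value = c.partition("==")
        return env.get(name.strip()) == value.strip().strip('"')

    return any(
        all(clause(c) for c in part.split(" and "))
        for part in expr.split(" or ")
    )


env = {"sys_platform": "linux", "extra": "chromadb"}
print(eval_marker('sys_platform == "win32" or extra == "chromadb"', env))   # True
print(eval_marker('sys_platform == "win32" and extra == "chromadb"', env))  # False
```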
@@ -1420,6 +1420,9 @@ files = [
    {file = "coverage-7.8.0.tar.gz", hash = "sha256:7a3d62b3b03b4b6fd41a085f3574874cf946cb4604d2b4d3e8dca8cd570ca501"},
]

[package.dependencies]
tomli = {version = "*", optional = true, markers = "python_full_version <= \"3.11.0a6\" and extra == \"toml\""}

[package.extras]
toml = ["tomli ; python_full_version <= \"3.11.0a6\""]

@@ -2295,7 +2298,7 @@ description = "The FlatBuffers serialization format for Python"
optional = true
python-versions = "*"
groups = ["main"]
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
files = [
    {file = "flatbuffers-25.2.10-py2.py3-none-any.whl", hash = "sha256:ebba5f4d5ea615af3f7fd70fc310636fbb2bbd1f566ac0a23d98dd412de50051"},
    {file = "flatbuffers-25.2.10.tar.gz", hash = "sha256:97e451377a41262f8d9bd4295cc836133415cc03d8cb966410a4af92eb00d26e"},

@@ -2690,15 +2693,15 @@ grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0.dev0)"]

[[package]]
name = "google-api-python-client"
version = "2.167.0"
version = "2.169.0"
description = "Google API Client Library for Python"
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"gemini\""
files = [
    {file = "google_api_python_client-2.167.0-py2.py3-none-any.whl", hash = "sha256:ce25290cc229505d770ca5c8d03850e0ae87d8e998fc6dd743ecece018baa396"},
    {file = "google_api_python_client-2.167.0.tar.gz", hash = "sha256:a458d402572e1c2caf9db090d8e7b270f43ff326bd9349c731a86b19910e3995"},
    {file = "google_api_python_client-2.169.0-py3-none-any.whl", hash = "sha256:dae3e882dc0e6f28e60cf09c1f13fedfd881db84f824dd418aa9e44def2fe00d"},
    {file = "google_api_python_client-2.169.0.tar.gz", hash = "sha256:0585bb97bd5f5bf3ed8d4bf624593e4c5a14d06c811d1952b07a1f94b4d12c51"},
]

[package.dependencies]

@@ -3164,14 +3167,14 @@ tornado = ["tornado (>=0.2)"]

[[package]]
name = "h11"
version = "0.14.0"
version = "0.16.0"
description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
optional = false
python-versions = ">=3.7"
python-versions = ">=3.8"
groups = ["main"]
files = [
    {file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
    {file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
    {file = "h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86"},
    {file = "h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1"},
]

[[package]]

@@ -3273,19 +3276,19 @@ files = [

[[package]]
name = "httpcore"
version = "1.0.8"
version = "1.0.9"
description = "A minimal low-level HTTP client."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
    {file = "httpcore-1.0.8-py3-none-any.whl", hash = "sha256:5254cf149bcb5f75e9d1b2b9f729ea4a4b883d1ad7379fc632b727cec23674be"},
    {file = "httpcore-1.0.8.tar.gz", hash = "sha256:86e94505ed24ea06514883fd44d2bc02d90e77e7979c8eb71b90f41d364a1bad"},
    {file = "httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55"},
    {file = "httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8"},
]

[package.dependencies]
certifi = "*"
h11 = ">=0.13,<0.15"
h11 = ">=0.16"

[package.extras]
asyncio = ["anyio (>=4.0,<5.0)"]
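The h11 and httpcore bumps above go together: httpcore 1.0.9 tightens its h11 pin from `>=0.13,<0.15` to `>=0.16`, so h11 must jump to 0.16.0 in the same lock update. A simplified sketch of the comparison a resolver performs against that lower bound (real resolvers implement full PEP 440 via the `packaging` library; this naive tuple compare covers plain numeric versions only):

```python
def parse(v: str) -> tuple:
    # naive numeric version parse; no pre-releases, post-releases, or epochs
    return tuple(int(p) for p in v.split("."))


def satisfies(version: str, minimum: str) -> bool:
    """Does `version` meet a '>=minimum' specifier?"""
    return parse(version) >= parse(minimum)


print(satisfies("0.14.0", "0.16"))  # False: the old h11 fails httpcore 1.0.9's ">=0.16"
print(satisfies("0.16.0", "0.16"))  # True
```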
@@ -3448,7 +3451,7 @@ description = "Human friendly output for text interfaces using Python"
optional = true
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
groups = ["main"]
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
files = [
    {file = "humanfriendly-10.0-py2.py3-none-any.whl", hash = "sha256:1697e1a8a8f550fd43c2865cd84542fc175a61dcb779b6fee18cf6b6ccba1477"},
    {file = "humanfriendly-10.0.tar.gz", hash = "sha256:6b0b831ce8f15f7300721aa49829fc4e83921a9a301cc7f606be6686a2288ddc"},

@@ -3625,15 +3628,15 @@ test = ["flaky", "ipyparallel", "pre-commit", "pytest (>=7.0)", "pytest-asyncio

[[package]]
name = "ipython"
version = "8.35.0"
version = "8.36.0"
description = "IPython: Productive Interactive Computing"
optional = true
python-versions = ">=3.10"
groups = ["main"]
markers = "python_version < \"3.11\" and (extra == \"notebook\" or extra == \"dev\")"
files = [
    {file = "ipython-8.35.0-py3-none-any.whl", hash = "sha256:e6b7470468ba6f1f0a7b116bb688a3ece2f13e2f94138e508201fad677a788ba"},
    {file = "ipython-8.35.0.tar.gz", hash = "sha256:d200b7d93c3f5883fc36ab9ce28a18249c7706e51347681f80a0aef9895f2520"},
    {file = "ipython-8.36.0-py3-none-any.whl", hash = "sha256:12b913914d010dcffa2711505ec8be4bf0180742d97f1e5175e51f22086428c1"},
    {file = "ipython-8.36.0.tar.gz", hash = "sha256:24658e9fe5c5c819455043235ba59cfffded4a35936eefceceab6b192f7092ff"},
]

[package.dependencies]

@@ -3665,15 +3668,15 @@ test-extra = ["curio", "ipython[test]", "jupyter_ai", "matplotlib (!=3.2.0)", "n

[[package]]
name = "ipython"
version = "9.1.0"
version = "9.2.0"
description = "IPython: Productive Interactive Computing"
optional = true
python-versions = ">=3.11"
groups = ["main"]
markers = "python_version >= \"3.11\" and (extra == \"notebook\" or extra == \"dev\")"
files = [
    {file = "ipython-9.1.0-py3-none-any.whl", hash = "sha256:2df07257ec2f84a6b346b8d83100bcf8fa501c6e01ab75cd3799b0bb253b3d2a"},
    {file = "ipython-9.1.0.tar.gz", hash = "sha256:a47e13a5e05e02f3b8e1e7a0f9db372199fe8c3763532fe7a1e0379e4e135f16"},
    {file = "ipython-9.2.0-py3-none-any.whl", hash = "sha256:fef5e33c4a1ae0759e0bba5917c9db4eb8c53fee917b6a526bd973e1ca5159f6"},
    {file = "ipython-9.2.0.tar.gz", hash = "sha256:62a9373dbc12f28f9feaf4700d052195bf89806279fc8ca11f3f54017d04751b"},
]

[package.dependencies]

@@ -4593,15 +4596,15 @@ tenacity = ">=8.1.0,<8.4.0 || >8.4.0,<10"

[[package]]
name = "langchain-core"
version = "0.3.55"
version = "0.3.56"
description = "Building applications with LLMs through composability"
optional = true
python-versions = "<4.0,>=3.9"
groups = ["main"]
markers = "extra == \"langchain\" or extra == \"deepeval\""
files = [
    {file = "langchain_core-0.3.55-py3-none-any.whl", hash = "sha256:b3cb36bf37755a616158a79866657c6697b43a2f7c69dd723ce425f1c76c1baa"},
    {file = "langchain_core-0.3.55.tar.gz", hash = "sha256:0f2b3e311621116a83510c70b0ac9d959030a0a457a69483535cff18501fedc9"},
    {file = "langchain_core-0.3.56-py3-none-any.whl", hash = "sha256:a20c6aca0fa0da265d96d3b14a5a01828ac5d2d9d27516434873d76f2d4839ed"},
    {file = "langchain_core-0.3.56.tar.gz", hash = "sha256:de896585bc56e12652327dcd195227c3739a07e86e587c91a07101e0df11dffe"},
]

[package.dependencies]

@@ -4752,14 +4755,13 @@ valkey = ["valkey (>=6)"]

[[package]]
name = "litellm"
version = "1.67.2"
version = "1.67.4.post1"
description = "Library to easily interface with LLM API providers"
optional = false
python-versions = "!=2.7.*,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,!=3.7.*,>=3.8"
groups = ["main"]
files = [
    {file = "litellm-1.67.2-py3-none-any.whl", hash = "sha256:32df4d17b3ead17d04793311858965e41e83a7bdf9bd661895c0e6bc9c78dc8b"},
    {file = "litellm-1.67.2.tar.gz", hash = "sha256:9e108827bff16af04fd4c35b0c1a1d6c7746c96db3870189a60141d449797487"},
    {file = "litellm-1.67.4.post1.tar.gz", hash = "sha256:057f2505f82d8c3f83d705c375b0d1931de998b13e239a6b06e16ee351fda648"},
]

[package.dependencies]
@@ -4777,19 +4779,19 @@ tokenizers = "*"

[package.extras]
extra-proxy = ["azure-identity (>=1.15.0,<2.0.0)", "azure-keyvault-secrets (>=4.8.0,<5.0.0)", "google-cloud-kms (>=2.21.3,<3.0.0)", "prisma (==0.11.0)", "redisvl (>=0.4.1,<0.5.0) ; python_version >= \"3.9\" and python_version < \"3.14\"", "resend (>=0.8.0,<0.9.0)"]
proxy = ["PyJWT (>=2.8.0,<3.0.0)", "apscheduler (>=3.10.4,<4.0.0)", "backoff", "boto3 (==1.34.34)", "cryptography (>=43.0.1,<44.0.0)", "fastapi (>=0.115.5,<0.116.0)", "fastapi-sso (>=0.16.0,<0.17.0)", "gunicorn (>=23.0.0,<24.0.0)", "litellm-proxy-extras (==0.1.11)", "mcp (==1.5.0) ; python_version >= \"3.10\"", "orjson (>=3.9.7,<4.0.0)", "pynacl (>=1.5.0,<2.0.0)", "python-multipart (>=0.0.18,<0.0.19)", "pyyaml (>=6.0.1,<7.0.0)", "rq", "uvicorn (>=0.29.0,<0.30.0)", "uvloop (>=0.21.0,<0.22.0)", "websockets (>=13.1.0,<14.0.0)"]
proxy = ["PyJWT (>=2.8.0,<3.0.0)", "apscheduler (>=3.10.4,<4.0.0)", "backoff", "boto3 (==1.34.34)", "cryptography (>=43.0.1,<44.0.0)", "fastapi (>=0.115.5,<0.116.0)", "fastapi-sso (>=0.16.0,<0.17.0)", "gunicorn (>=23.0.0,<24.0.0)", "litellm-proxy-extras (==0.1.13)", "mcp (==1.5.0) ; python_version >= \"3.10\"", "orjson (>=3.9.7,<4.0.0)", "pynacl (>=1.5.0,<2.0.0)", "python-multipart (>=0.0.18,<0.0.19)", "pyyaml (>=6.0.1,<7.0.0)", "rq", "uvicorn (>=0.29.0,<0.30.0)", "uvloop (>=0.21.0,<0.22.0)", "websockets (>=13.1.0,<14.0.0)"]

[[package]]
name = "llama-cloud"
version = "0.1.18"
version = "0.1.19"
description = ""
optional = true
python-versions = "<4,>=3.8"
groups = ["main"]
markers = "extra == \"deepeval\""
files = [
    {file = "llama_cloud-0.1.18-py3-none-any.whl", hash = "sha256:5842722a0c3033afa930b4a50d43e6f1e77ff1dab12383a769dc51a15fb87c9b"},
    {file = "llama_cloud-0.1.18.tar.gz", hash = "sha256:65cb88b1cb1a3a0e63e4438e8c8a2e6013dfdafbb4201d274c0459e5d04fb328"},
    {file = "llama_cloud-0.1.19-py3-none-any.whl", hash = "sha256:d2d551baa4b63f7717f8e04cbb81b0f817e5450a66870c5487dd371f81dab8ec"},
    {file = "llama_cloud-0.1.19.tar.gz", hash = "sha256:b0a5424ae0099ca27df2a2d7e5aec99066de9ca860ab65987c9f931f1ea7abff"},
]

[package.dependencies]

@@ -4799,20 +4801,20 @@ pydantic = ">=1.10"

[[package]]
name = "llama-cloud-services"
version = "0.6.15"
version = "0.6.20"
description = "Tailored SDK clients for LlamaCloud services."
optional = true
python-versions = "<4.0,>=3.9"
groups = ["main"]
markers = "extra == \"deepeval\""
files = [
    {file = "llama_cloud_services-0.6.15-py3-none-any.whl", hash = "sha256:c4e24dd41f2cde17eeba7750d41cc70fe26e1179c03ae832122d762572e53de6"},
    {file = "llama_cloud_services-0.6.15.tar.gz", hash = "sha256:912799d9cdcf48074145c6781f40a6dd7dadb6344ecb30b715407db85a0e675e"},
    {file = "llama_cloud_services-0.6.20-py3-none-any.whl", hash = "sha256:f6878f602551112b0d7520b891dc88281ba6135285aaa4aaa89e55d684e7fb1f"},
    {file = "llama_cloud_services-0.6.20.tar.gz", hash = "sha256:1cba4d33e9c40eaa3cbf50e924c7b6371ba73be02830a615fcf86245c4f3a142"},
]

[package.dependencies]
click = ">=8.1.7,<9.0.0"
llama-cloud = ">=0.1.18,<0.2.0"
llama-cloud = "0.1.19"
llama-index-core = ">=0.11.0"
platformdirs = ">=4.3.7,<5.0.0"
pydantic = "!=2.10"

@@ -5065,19 +5067,19 @@ llama-parse = ">=0.5.0"

[[package]]
name = "llama-parse"
version = "0.6.12"
version = "0.6.20"
description = "Parse files into RAG-Optimized formats."
optional = true
python-versions = "<4.0,>=3.9"
groups = ["main"]
markers = "extra == \"deepeval\""
files = [
    {file = "llama_parse-0.6.12-py3-none-any.whl", hash = "sha256:2dd1c74b0cba1a2bc300286f6b91a650f6ddc396acfce3497ba3d72d43c53fac"},
    {file = "llama_parse-0.6.12.tar.gz", hash = "sha256:c99593fb955c338a69e64a2ec449e09753afe6dcff239ab050989fda74839867"},
    {file = "llama_parse-0.6.20-py3-none-any.whl", hash = "sha256:7f99d42a0fa70530d0560d1235bba14a7c198c8db8c0fddd80c32ad61b1d5b10"},
    {file = "llama_parse-0.6.20.tar.gz", hash = "sha256:73aabf19229a6d8f77c79864c39e8bda0b0acef6eb1f8f65fed656d875259e7c"},
]

[package.dependencies]
llama-cloud-services = ">=0.6.12"
llama-cloud-services = ">=0.6.20"

[[package]]
name = "loguru"
@@ -5952,7 +5954,7 @@ description = "Python library for arbitrary-precision floating-point arithmetic"
optional = true
python-versions = "*"
groups = ["main"]
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
files = [
    {file = "mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c"},
    {file = "mpmath-1.3.0.tar.gz", hash = "sha256:7a28eb2a9774d00c7bc92411c19a89209d5da7c4c9a9e227be8330a23a25b91f"},

@@ -6151,15 +6153,15 @@ files = [

[[package]]
name = "narwhals"
version = "1.36.0"
version = "1.37.1"
description = "Extremely lightweight compatibility layer between dataframe libraries"
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"evals\""
files = [
    {file = "narwhals-1.36.0-py3-none-any.whl", hash = "sha256:e3c50dd1d769bc145f57ae17c1f0f0da6c3d397d62cdd0bb167e9b618e95c9d6"},
    {file = "narwhals-1.36.0.tar.gz", hash = "sha256:7cd860e7e066609bd8a042bb5b8e4193275532114448210a91cbd5c622b6e5eb"},
    {file = "narwhals-1.37.1-py3-none-any.whl", hash = "sha256:6f358a23b7351897d6efb45496dc0528918ce4ca6c8f9631594885cd873576a7"},
    {file = "narwhals-1.37.1.tar.gz", hash = "sha256:1eb8f17ff00e6c471d5afb704e9068f41657234eb73bde2ee66ad975a170015b"},
]

[package.extras]

@@ -6172,6 +6174,7 @@ pandas = ["pandas (>=0.25.3)"]
polars = ["polars (>=0.20.3)"]
pyarrow = ["pyarrow (>=11.0.0)"]
pyspark = ["pyspark (>=3.5.0)"]
pyspark-connect = ["pyspark[connect] (>=3.5.0)"]
sqlframe = ["sqlframe (>=3.22.0)"]

[[package]]

@@ -6533,7 +6536,7 @@ description = "ONNX Runtime is a runtime accelerator for Machine Learning models
optional = true
python-versions = ">=3.10"
groups = ["main"]
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
files = [
    {file = "onnxruntime-1.21.1-cp310-cp310-macosx_13_0_universal2.whl", hash = "sha256:daedb5d33d8963062a25f4a3c788262074587f685a19478ef759a911b4b12c25"},
    {file = "onnxruntime-1.21.1-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a402f9bda0b1cc791d9cf31d23c471e8189a55369b49ef2b9d0854eb11d22c4"},

@@ -6565,14 +6568,14 @@ sympy = "*"

[[package]]
name = "openai"
version = "1.76.0"
version = "1.76.1"
description = "The official Python library for the openai API"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
    {file = "openai-1.76.0-py3-none-any.whl", hash = "sha256:a712b50e78cf78e6d7b2a8f69c4978243517c2c36999756673e07a14ce37dc0a"},
    {file = "openai-1.76.0.tar.gz", hash = "sha256:fd2bfaf4608f48102d6b74f9e11c5ecaa058b60dad9c36e409c12477dfd91fb2"},
    {file = "openai-1.76.1-py3-none-any.whl", hash = "sha256:2bb420f882dd9212ed6787d479390b1f556e99749e37c7f57fcc02924f7cd1a5"},
    {file = "openai-1.76.1.tar.gz", hash = "sha256:201a0357e20c3c26cf05aaa711363d151794f645d378d6c46a4730d147a399ad"},
]

[package.dependencies]
@ -6730,81 +6733,85 @@ files = [
|
|||
|
||||
[[package]]
|
||||
name = "orjson"
|
||||
version = "3.10.16"
|
||||
version = "3.10.17"
|
||||
description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
groups = ["main"]
|
||||
markers = "platform_python_implementation != \"PyPy\""
|
||||
files = [
|
||||
{file = "orjson-3.10.16-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:4cb473b8e79154fa778fb56d2d73763d977be3dcc140587e07dbc545bbfc38f8"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:622a8e85eeec1948690409a19ca1c7d9fd8ff116f4861d261e6ae2094fe59a00"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c682d852d0ce77613993dc967e90e151899fe2d8e71c20e9be164080f468e370"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c520ae736acd2e32df193bcff73491e64c936f3e44a2916b548da048a48b46b"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:134f87c76bfae00f2094d85cfab261b289b76d78c6da8a7a3b3c09d362fd1e06"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b59afde79563e2cf37cfe62ee3b71c063fd5546c8e662d7fcfc2a3d5031a5c4c"},
{file = "orjson-3.10.16-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:113602f8241daaff05d6fad25bd481d54c42d8d72ef4c831bb3ab682a54d9e15"},
{file = "orjson-3.10.16-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4fc0077d101f8fab4031e6554fc17b4c2ad8fdbc56ee64a727f3c95b379e31da"},
{file = "orjson-3.10.16-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:9c6bf6ff180cd69e93f3f50380224218cfab79953a868ea3908430bcfaf9cb5e"},
{file = "orjson-3.10.16-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:5673eadfa952f95a7cd76418ff189df11b0a9c34b1995dff43a6fdbce5d63bf4"},
{file = "orjson-3.10.16-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5fe638a423d852b0ae1e1a79895851696cb0d9fa0946fdbfd5da5072d9bb9551"},
{file = "orjson-3.10.16-cp310-cp310-win32.whl", hash = "sha256:33af58f479b3c6435ab8f8b57999874b4b40c804c7a36b5cc6b54d8f28e1d3dd"},
{file = "orjson-3.10.16-cp310-cp310-win_amd64.whl", hash = "sha256:0338356b3f56d71293c583350af26f053017071836b07e064e92819ecf1aa055"},
{file = "orjson-3.10.16-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:44fcbe1a1884f8bc9e2e863168b0f84230c3d634afe41c678637d2728ea8e739"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78177bf0a9d0192e0b34c3d78bcff7fe21d1b5d84aeb5ebdfe0dbe637b885225"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:12824073a010a754bb27330cad21d6e9b98374f497f391b8707752b96f72e741"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ddd41007e56284e9867864aa2f29f3136bb1dd19a49ca43c0b4eda22a579cf53"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0877c4d35de639645de83666458ca1f12560d9fa7aa9b25d8bb8f52f61627d14"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9a09a539e9cc3beead3e7107093b4ac176d015bec64f811afb5965fce077a03c"},
{file = "orjson-3.10.16-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:31b98bc9b40610fec971d9a4d67bb2ed02eec0a8ae35f8ccd2086320c28526ca"},
{file = "orjson-3.10.16-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0ce243f5a8739f3a18830bc62dc2e05b69a7545bafd3e3249f86668b2bcd8e50"},
{file = "orjson-3.10.16-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:64792c0025bae049b3074c6abe0cf06f23c8e9f5a445f4bab31dc5ca23dbf9e1"},
{file = "orjson-3.10.16-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:ea53f7e68eec718b8e17e942f7ca56c6bd43562eb19db3f22d90d75e13f0431d"},
{file = "orjson-3.10.16-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a741ba1a9488c92227711bde8c8c2b63d7d3816883268c808fbeada00400c164"},
{file = "orjson-3.10.16-cp311-cp311-win32.whl", hash = "sha256:c7ed2c61bb8226384c3fdf1fb01c51b47b03e3f4536c985078cccc2fd19f1619"},
{file = "orjson-3.10.16-cp311-cp311-win_amd64.whl", hash = "sha256:cd67d8b3e0e56222a2e7b7f7da9031e30ecd1fe251c023340b9f12caca85ab60"},
{file = "orjson-3.10.16-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:6d3444abbfa71ba21bb042caa4b062535b122248259fdb9deea567969140abca"},
{file = "orjson-3.10.16-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:30245c08d818fdcaa48b7d5b81499b8cae09acabb216fe61ca619876b128e184"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0ba1d0baa71bf7579a4ccdcf503e6f3098ef9542106a0eca82395898c8a500a"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb0beefa5ef3af8845f3a69ff2a4aa62529b5acec1cfe5f8a6b4141033fd46ef"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6daa0e1c9bf2e030e93c98394de94506f2a4d12e1e9dadd7c53d5e44d0f9628e"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9da9019afb21e02410ef600e56666652b73eb3e4d213a0ec919ff391a7dd52aa"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:daeb3a1ee17b69981d3aae30c3b4e786b0f8c9e6c71f2b48f1aef934f63f38f4"},
{file = "orjson-3.10.16-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80fed80eaf0e20a31942ae5d0728849862446512769692474be5e6b73123a23b"},
{file = "orjson-3.10.16-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:73390ed838f03764540a7bdc4071fe0123914c2cc02fb6abf35182d5fd1b7a42"},
{file = "orjson-3.10.16-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:a22bba012a0c94ec02a7768953020ab0d3e2b884760f859176343a36c01adf87"},
{file = "orjson-3.10.16-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5385bbfdbc90ff5b2635b7e6bebf259652db00a92b5e3c45b616df75b9058e88"},
{file = "orjson-3.10.16-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:02c6279016346e774dd92625d46c6c40db687b8a0d685aadb91e26e46cc33e1e"},
{file = "orjson-3.10.16-cp312-cp312-win32.whl", hash = "sha256:7ca55097a11426db80f79378e873a8c51f4dde9ffc22de44850f9696b7eb0e8c"},
{file = "orjson-3.10.16-cp312-cp312-win_amd64.whl", hash = "sha256:86d127efdd3f9bf5f04809b70faca1e6836556ea3cc46e662b44dab3fe71f3d6"},
{file = "orjson-3.10.16-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:148a97f7de811ba14bc6dbc4a433e0341ffd2cc285065199fb5f6a98013744bd"},
{file = "orjson-3.10.16-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:1d960c1bf0e734ea36d0adc880076de3846aaec45ffad29b78c7f1b7962516b8"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a318cd184d1269f68634464b12871386808dc8b7c27de8565234d25975a7a137"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:df23f8df3ef9223d1d6748bea63fca55aae7da30a875700809c500a05975522b"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b94dda8dd6d1378f1037d7f3f6b21db769ef911c4567cbaa962bb6dc5021cf90"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f12970a26666a8775346003fd94347d03ccb98ab8aa063036818381acf5f523e"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:15a1431a245d856bd56e4d29ea0023eb4d2c8f71efe914beb3dee8ab3f0cd7fb"},
{file = "orjson-3.10.16-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c83655cfc247f399a222567d146524674a7b217af7ef8289c0ff53cfe8db09f0"},
{file = "orjson-3.10.16-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:fa59ae64cb6ddde8f09bdbf7baf933c4cd05734ad84dcf4e43b887eb24e37652"},
{file = "orjson-3.10.16-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ca5426e5aacc2e9507d341bc169d8af9c3cbe88f4cd4c1cf2f87e8564730eb56"},
{file = "orjson-3.10.16-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:6fd5da4edf98a400946cd3a195680de56f1e7575109b9acb9493331047157430"},
{file = "orjson-3.10.16-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:980ecc7a53e567169282a5e0ff078393bac78320d44238da4e246d71a4e0e8f5"},
{file = "orjson-3.10.16-cp313-cp313-win32.whl", hash = "sha256:28f79944dd006ac540a6465ebd5f8f45dfdf0948ff998eac7a908275b4c1add6"},
{file = "orjson-3.10.16-cp313-cp313-win_amd64.whl", hash = "sha256:fe0a145e96d51971407cb8ba947e63ead2aa915db59d6631a355f5f2150b56b7"},
{file = "orjson-3.10.16-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:c35b5c1fb5a5d6d2fea825dec5d3d16bea3c06ac744708a8e1ff41d4ba10cdf1"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c9aac7ecc86218b4b3048c768f227a9452287001d7548500150bb75ee21bf55d"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6e19f5102fff36f923b6dfdb3236ec710b649da975ed57c29833cb910c5a73ab"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:17210490408eb62755a334a6f20ed17c39f27b4f45d89a38cd144cd458eba80b"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fbbe04451db85916e52a9f720bd89bf41f803cf63b038595674691680cbebd1b"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a966eba501a3a1f309f5a6af32ed9eb8f316fa19d9947bac3e6350dc63a6f0a"},
{file = "orjson-3.10.16-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01e0d22f06c81e6c435723343e1eefc710e0510a35d897856766d475f2a15687"},
{file = "orjson-3.10.16-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7c1e602d028ee285dbd300fb9820b342b937df64d5a3336e1618b354e95a2569"},
{file = "orjson-3.10.16-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:d230e5020666a6725629df81e210dc11c3eae7d52fe909a7157b3875238484f3"},
{file = "orjson-3.10.16-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:0f8baac07d4555f57d44746a7d80fbe6b2c4fe2ed68136b4abb51cfec512a5e9"},
{file = "orjson-3.10.16-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:524e48420b90fc66953e91b660b3d05faaf921277d6707e328fde1c218b31250"},
{file = "orjson-3.10.16-cp39-cp39-win32.whl", hash = "sha256:a9f614e31423d7292dbca966a53b2d775c64528c7d91424ab2747d8ab8ce5c72"},
{file = "orjson-3.10.16-cp39-cp39-win_amd64.whl", hash = "sha256:c338dc2296d1ed0d5c5c27dfb22d00b330555cb706c2e0be1e1c3940a0895905"},
{file = "orjson-3.10.16.tar.gz", hash = "sha256:d2aaa5c495e11d17b9b93205f5fa196737ee3202f000aaebf028dc9a73750f10"},
{file = "orjson-3.10.17-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:bc399cf138a0201d0bf2399b44195d33a0a5aee149dab114340da0d766c88b95"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:59225b27b72e0e1626d869f7b987da6c74f9b6026cf9a87c1cdaf74ca9f7b8c0"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6a749b52b6ae9ac1b937a623627c40be4d4aa9ee28e8957b8e75dca4c655719f"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7db1fa1e207451fc317c09fa1cb96b56656b6d9f1b03c93a2bb7624a0dd83528"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e56dc219ae433db1a8ff0e0170102c5cc7da6d1386ae6d31ed0810faa540939d"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2ca889e2ed89b83e749756f9f5c248d33681398e9fe79da497d147d952e34d1f"},
{file = "orjson-3.10.17-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1978381c3c7bcf549150c272f8f2b77ed8738a1e4086042ac2896d19137938e9"},
{file = "orjson-3.10.17-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:df9b2f34aedd852a1e89323863ed810aa684084eae07e4d8ea8148bd1da9e4a8"},
{file = "orjson-3.10.17-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:85c64c9468ea7f52d1944716907c2657643dd608f41fd700f029f6d149fefa16"},
{file = "orjson-3.10.17-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:dd96d7173cfc0973070644839d9b6c9fdcdf22672b700f996d304606a1d1326a"},
{file = "orjson-3.10.17-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:521f58b2ccb705926d0782b468ea7d2695c8e9c3b91a02de9ad521cba7fcfb47"},
{file = "orjson-3.10.17-cp310-cp310-win32.whl", hash = "sha256:f1182e24ba7c788cea9d907133ae66206cf3cce1fb13b4b00cc0c49749f59908"},
{file = "orjson-3.10.17-cp310-cp310-win_amd64.whl", hash = "sha256:8aa1685aba1927168ed0c675146e8a53c81a815a282f57af1f024f6c72d76caa"},
{file = "orjson-3.10.17-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:4cc5adc211a459e53e95c4080154a1ef022c1ef887896de87947355e66c5ebcc"},
{file = "orjson-3.10.17-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:3af86347748765b468d0025d4714a439a456e6919428298618c552a6a9798652"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c61177e88695e557f18ddc9f1f0d68ed84516b1d07353a1bfe1775b66f0c2cc"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:85662edf27e21282f3bde8a597d50b81f4b244e5028005a2f26878f42167accb"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1793b5b521921e038828224340830e1babd02acaa3a840b315c574f5b0e4caa0"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f337df3e1376f2d81384c18bd8206acc3909922c7c6c7fa15db1aeb9589f7dc2"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81f8da1dfdd5e2f849e207c7fcc35440683754668320689fcd2805407aa1a9d1"},
{file = "orjson-3.10.17-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f4412311af2b9a5e5266ce5a4f1fd2d1107eaf9a9e145108f0c4dbe22551434"},
{file = "orjson-3.10.17-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:80c5dc16427d3ea25b2b865f7e2161e20867eff91b79b36be2066b91f43a9d20"},
{file = "orjson-3.10.17-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:40790c8a4a5333add6fafb0d0484dfecb8b2af9f0629724a8c63ec3d34f250fd"},
{file = "orjson-3.10.17-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6c2bee95b2260e2a364ad9410c16b337127e1161cee5e8f66019f2b93bfe649e"},
{file = "orjson-3.10.17-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8a9b32907fca449aa5bf0d3cf7c81c80b2b8d73accddaa2be781e3bf6e0eb0e0"},
{file = "orjson-3.10.17-cp311-cp311-win32.whl", hash = "sha256:287eed8af86436697d974797bf0d1fde56c7dee696e99765c8f499967d74c014"},
{file = "orjson-3.10.17-cp311-cp311-win_amd64.whl", hash = "sha256:7ebde38a8065b2dec297f320e58094b994bf244f643b55bacaa0c729b3efc3aa"},
{file = "orjson-3.10.17-cp311-cp311-win_arm64.whl", hash = "sha256:3bc0cb7050d218111cf9239f74b247dcccc17569c3a16f09f6ccd51f06f94be6"},
{file = "orjson-3.10.17-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:70ea106d9450c71f969865eeb44132c68b716d2e68ee87add8f2d60651f748ec"},
{file = "orjson-3.10.17-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:37595318ac4ffb7d75b8d025b48ebe9203243bceb216a3b648d91c806163de35"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6addbf04eb07e51e7b4dba7ed5a2608ad45085c4abeacb133872efd37951d4cb"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9ad6cf08e41dddc709fdf2dedfb1e4187e49e72436be71805003531037c2bcd6"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fb1c9b14bd9c981ce62f0002d9798eca3c512b6931e5c050b3708a84fac8f7f4"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c11841d5b5e1e5b1fed1b0ad177725b65a7ee73941189b9994ed98cf2fa03230"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d8c8d5eb32d3276c7798b23bd2a97d8940f86ca4ae909d7528ab497d9ead0b04"},
{file = "orjson-3.10.17-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cee320b0c353c7bdd69c4e60b8e0329e5eaf085254904501f4f1d41a0e1077ba"},
{file = "orjson-3.10.17-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a4aac9de5b6cb356c56fa488c8ef77802f61dbdaa6a5fe9178edb750742d536f"},
{file = "orjson-3.10.17-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:a00b3252a7f337f2a33ce6ae0cb7923921f8024c6cd834d02b75be193c1efcee"},
{file = "orjson-3.10.17-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8924df38e1a8e4c0d94076ff9bc0d459cb2df6ba3627ec4355842da99a400a97"},
{file = "orjson-3.10.17-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b275ccd7b90a44e4624faaf675e45c959692f999e084c678ddd9d15e81a2321e"},
{file = "orjson-3.10.17-cp312-cp312-win32.whl", hash = "sha256:ae36abe17765dbe7cc4b46303bb093594739703ece97fd87b6bd421ea26d1c32"},
{file = "orjson-3.10.17-cp312-cp312-win_amd64.whl", hash = "sha256:53fdb906f7a22b9e6ece0b1bc5b5d7cccfb753646907c35383ecf6237b16d42a"},
{file = "orjson-3.10.17-cp312-cp312-win_arm64.whl", hash = "sha256:b1d17742f7ee4487103e9cbc3356d283291bab1956c511720ffca54c0befb255"},
{file = "orjson-3.10.17-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:413d07f2b082e74988b70bcfc17ffc7bd90164e1e7214705fc9ace4e114a77e7"},
{file = "orjson-3.10.17-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:2fa4a286aca6d15ebf7d55261d9be876108d5f3885ffed93a6ae920d226fa2da"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a3ffa7729b3fb99ae9e4ed7b82b34990eb90cb992c80af80d3a5403ce67b181f"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:610127319105dbac9dec43676116aee109c3dd22461c33fddcfbebb828407c12"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:05a9f740f2ab76d7b55a9cddd1a9f6dd0301256302923a232cca2c5002ec586a"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:37da89968c311153c53471c2f49fc0fba41fcd6129a90208158ecd3172cf5647"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c7fcbf27de800c124667daa22b4902d8bdcd42e99382960f90476f1e8b036dc"},
{file = "orjson-3.10.17-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3af3e11abf8bd77a2b82863eeedd847c694fd64a4f9cb6b1644015a5aef44b84"},
{file = "orjson-3.10.17-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f76d0ab83bb30d393353e2ed211ab99eb3e6a71574fb1c662bca33ee39621"},
{file = "orjson-3.10.17-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:063c2f1511113de8bcf014d835f7284a64c250518588a034a39a242990fb4408"},
{file = "orjson-3.10.17-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:599c7107d913a2d9770d33abd034d1b869733453283d41f7cc9404710382b40f"},
{file = "orjson-3.10.17-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9fa216d28c4ec30b352a3856d8dcd1eb5b97d38b643b68e93bd74e9925080abe"},
{file = "orjson-3.10.17-cp313-cp313-win32.whl", hash = "sha256:094b1621e364f8717f8df796a1fa50532d9d506439952009d0a7dba66f3bb1d0"},
{file = "orjson-3.10.17-cp313-cp313-win_amd64.whl", hash = "sha256:1fdb277021d1a2d0a732f4ee375f1183482ed3f0d4aa95cec5aaa70a231b6599"},
{file = "orjson-3.10.17-cp313-cp313-win_arm64.whl", hash = "sha256:ddb695dcd773a2363a03d8fd66e9fc0c10a98c1e51b980b63118bdb20ab2f071"},
{file = "orjson-3.10.17-cp39-cp39-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:dc711f86bfb2c5ec43b46e8dca01f4f4afb3c5cb7f6c3bef504392d3e22c5baf"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c4e09cddfa79bbc13045c4137193c85dceb83f82bec5c624fcefc59dab7ac03"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1b82b042863eb7fa6fddb5b92fb9c09aad5080aaf9806e45e4350e3b32083ed0"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:94af613e4f4233bb7fa919f55cf1ac558aedf734e4f41cc370fde267cc2e7b5b"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:04b08a5deac8e45fc908240597159e6d74763f3c6b3cfa2830bc05c484359399"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c5eab7f0c7ab0d1aba8b78e3062a6bac31a046d36f42b7285aaf6e4eef529a7"},
{file = "orjson-3.10.17-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c51514ffea808d1e190f6b2f1e00d65e7ab910b71d9c49f76058e60cb0a67774"},
{file = "orjson-3.10.17-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:f7368a97cba7d83cf09f7a1b352779a6070d6c92055fa703bcd313a96a73e8ea"},
{file = "orjson-3.10.17-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:28ac8f75113a927051bf3967ab41be0dfebb6c3595dfbd13f2d4d78db6cafce1"},
{file = "orjson-3.10.17-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:027ec92683681f3390c05363eb625cae1ceb721ff59999dd4dc37fd382073133"},
{file = "orjson-3.10.17-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:ff9896c0b7c6c74b2882eb2d7ae18a54defe6c9a34678ca52449997990b05ac9"},
{file = "orjson-3.10.17-cp39-cp39-win32.whl", hash = "sha256:634d7a84e68cf37ae2dc51c24b6fbd1f0fbc802a9253364ff65652a57b921235"},
{file = "orjson-3.10.17-cp39-cp39-win_amd64.whl", hash = "sha256:dadfb27fb5385e6ed25d0914d2bd9e5e85785c65e9dce60598cb4cad3c8053fc"},
{file = "orjson-3.10.17.tar.gz", hash = "sha256:28eeae6a15243966962b658dfcf7bae9e7bb1f3260dfcf0370dbd41f5ff6058b"},
]
[[package]]
@@ -6929,8 +6936,8 @@ files = [
[package.dependencies]
numpy = [
{version = ">=1.22.4", markers = "python_version < \"3.11\""},
{version = ">=1.26.0", markers = "python_version >= \"3.12\""},
{version = ">=1.23.2", markers = "python_version == \"3.11\""},
{version = ">=1.26.0", markers = "python_version >= \"3.12\""},
]
python-dateutil = ">=2.8.2"
pytz = ">=2020.1"
@@ -7549,7 +7556,7 @@ description = ""
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "python_version == \"3.10\" and extra == \"codegraph\" or (extra == \"chromadb\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"gemini\" or extra == \"deepeval\" or extra == \"milvus\") and python_version < \"3.11\" or (python_version == \"3.12\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\" or extra == \"chromadb\") and (extra == \"codegraph\" or extra == \"chromadb\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\") and python_version >= \"3.12\" or python_version == \"3.11\" and (extra == \"codegraph\" or extra == \"chromadb\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\")"
markers = "python_version == \"3.10\" and extra == \"codegraph\" or (extra == \"chromadb\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"gemini\" or extra == \"deepeval\" or extra == \"milvus\") and python_version < \"3.11\" or python_version == \"3.11\" and (extra == \"codegraph\" or extra == \"chromadb\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\") or (python_version == \"3.12\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\" or extra == \"chromadb\") and (extra == \"codegraph\" or extra == \"chromadb\" or extra == \"gemini\" or extra == \"weaviate\" or extra == \"qdrant\" or extra == \"deepeval\" or extra == \"milvus\") and python_version >= \"3.12\""
files = [
{file = "protobuf-5.29.4-cp310-abi3-win32.whl", hash = "sha256:13eb236f8eb9ec34e63fc8b1d6efd2777d062fa6aaa68268fb67cf77f6839ad7"},
{file = "protobuf-5.29.4-cp310-abi3-win_amd64.whl", hash = "sha256:bcefcdf3976233f8a502d265eb65ea740c989bacc6c30a58290ed0e519eb4b8d"},
@@ -7785,54 +7792,67 @@ files = [
[[package]]
name = "pyarrow"
version = "19.0.1"
version = "20.0.0"
description = "Python library for Apache Arrow"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pyarrow-19.0.1-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:fc28912a2dc924dddc2087679cc8b7263accc71b9ff025a1362b004711661a69"},
{file = "pyarrow-19.0.1-cp310-cp310-macosx_12_0_x86_64.whl", hash = "sha256:fca15aabbe9b8355800d923cc2e82c8ef514af321e18b437c3d782aa884eaeec"},
{file = "pyarrow-19.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ad76aef7f5f7e4a757fddcdcf010a8290958f09e3470ea458c80d26f4316ae89"},
{file = "pyarrow-19.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d03c9d6f2a3dffbd62671ca070f13fc527bb1867b4ec2b98c7eeed381d4f389a"},
{file = "pyarrow-19.0.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:65cf9feebab489b19cdfcfe4aa82f62147218558d8d3f0fc1e9dea0ab8e7905a"},
{file = "pyarrow-19.0.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:41f9706fbe505e0abc10e84bf3a906a1338905cbbcf1177b71486b03e6ea6608"},
{file = "pyarrow-19.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:c6cb2335a411b713fdf1e82a752162f72d4a7b5dbc588e32aa18383318b05866"},
{file = "pyarrow-19.0.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:cc55d71898ea30dc95900297d191377caba257612f384207fe9f8293b5850f90"},
{file = "pyarrow-19.0.1-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:7a544ec12de66769612b2d6988c36adc96fb9767ecc8ee0a4d270b10b1c51e00"},
{file = "pyarrow-19.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0148bb4fc158bfbc3d6dfe5001d93ebeed253793fff4435167f6ce1dc4bddeae"},
{file = "pyarrow-19.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f24faab6ed18f216a37870d8c5623f9c044566d75ec586ef884e13a02a9d62c5"},
{file = "pyarrow-19.0.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:4982f8e2b7afd6dae8608d70ba5bd91699077323f812a0448d8b7abdff6cb5d3"},
{file = "pyarrow-19.0.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:49a3aecb62c1be1d822f8bf629226d4a96418228a42f5b40835c1f10d42e4db6"},
{file = "pyarrow-19.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:008a4009efdb4ea3d2e18f05cd31f9d43c388aad29c636112c2966605ba33466"},
{file = "pyarrow-19.0.1-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:80b2ad2b193e7d19e81008a96e313fbd53157945c7be9ac65f44f8937a55427b"},
{file = "pyarrow-19.0.1-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:ee8dec072569f43835932a3b10c55973593abc00936c202707a4ad06af7cb294"},
{file = "pyarrow-19.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d5d1ec7ec5324b98887bdc006f4d2ce534e10e60f7ad995e7875ffa0ff9cb14"},
{file = "pyarrow-19.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f3ad4c0eb4e2a9aeb990af6c09e6fa0b195c8c0e7b272ecc8d4d2b6574809d34"},
{file = "pyarrow-19.0.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:d383591f3dcbe545f6cc62daaef9c7cdfe0dff0fb9e1c8121101cabe9098cfa6"},
{file = "pyarrow-19.0.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:b4c4156a625f1e35d6c0b2132635a237708944eb41df5fbe7d50f20d20c17832"},
{file = "pyarrow-19.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:5bd1618ae5e5476b7654c7b55a6364ae87686d4724538c24185bbb2952679960"},
{file = "pyarrow-19.0.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:e45274b20e524ae5c39d7fc1ca2aa923aab494776d2d4b316b49ec7572ca324c"},
{file = "pyarrow-19.0.1-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:d9dedeaf19097a143ed6da37f04f4051aba353c95ef507764d344229b2b740ae"},
{file = "pyarrow-19.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ebfb5171bb5f4a52319344ebbbecc731af3f021e49318c74f33d520d31ae0c4"},
{file = "pyarrow-19.0.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2a21d39fbdb948857f67eacb5bbaaf36802de044ec36fbef7a1c8f0dd3a4ab2"},
{file = "pyarrow-19.0.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:99bc1bec6d234359743b01e70d4310d0ab240c3d6b0da7e2a93663b0158616f6"},
{file = "pyarrow-19.0.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:1b93ef2c93e77c442c979b0d596af45e4665d8b96da598db145b0fec014b9136"},
{file = "pyarrow-19.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:d9d46e06846a41ba906ab25302cf0fd522f81aa2a85a71021826f34639ad31ef"},
{file = "pyarrow-19.0.1-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:c0fe3dbbf054a00d1f162fda94ce236a899ca01123a798c561ba307ca38af5f0"},
{file = "pyarrow-19.0.1-cp313-cp313t-macosx_12_0_x86_64.whl", hash = "sha256:96606c3ba57944d128e8a8399da4812f56c7f61de8c647e3470b417f795d0ef9"},
{file = "pyarrow-19.0.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f04d49a6b64cf24719c080b3c2029a3a5b16417fd5fd7c4041f94233af732f3"},
{file = "pyarrow-19.0.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5a9137cf7e1640dce4c190551ee69d478f7121b5c6f323553b319cac936395f6"},
{file = "pyarrow-19.0.1-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:7c1bca1897c28013db5e4c83944a2ab53231f541b9e0c3f4791206d0c0de389a"},
{file = "pyarrow-19.0.1-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:58d9397b2e273ef76264b45531e9d552d8ec8a6688b7390b5be44c02a37aade8"},
{file = "pyarrow-19.0.1-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:b9766a47a9cb56fefe95cb27f535038b5a195707a08bf61b180e642324963b46"},
{file = "pyarrow-19.0.1-cp39-cp39-macosx_12_0_x86_64.whl", hash = "sha256:6c5941c1aac89a6c2f2b16cd64fe76bcdb94b2b1e99ca6459de4e6f07638d755"},
{file = "pyarrow-19.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fd44d66093a239358d07c42a91eebf5015aa54fccba959db899f932218ac9cc8"},
{file = "pyarrow-19.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:335d170e050bcc7da867a1ed8ffb8b44c57aaa6e0843b156a501298657b1e972"},
{file = "pyarrow-19.0.1-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:1c7556165bd38cf0cd992df2636f8bcdd2d4b26916c6b7e646101aff3c16f76f"},
{file = "pyarrow-19.0.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:699799f9c80bebcf1da0983ba86d7f289c5a2a5c04b945e2f2bcf7e874a91911"},
{file = "pyarrow-19.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:8464c9fbe6d94a7fe1599e7e8965f350fd233532868232ab2596a71586c5a429"},
{file = "pyarrow-19.0.1.tar.gz", hash = "sha256:3bf266b485df66a400f282ac0b6d1b500b9d2ae73314a153dbe97d6d5cc8a99e"},
{file = "pyarrow-20.0.0-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:c7dd06fd7d7b410ca5dc839cc9d485d2bc4ae5240851bcd45d85105cc90a47d7"},
{file = "pyarrow-20.0.0-cp310-cp310-macosx_12_0_x86_64.whl", hash = "sha256:d5382de8dc34c943249b01c19110783d0d64b207167c728461add1ecc2db88e4"},
{file = "pyarrow-20.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6415a0d0174487456ddc9beaead703d0ded5966129fa4fd3114d76b5d1c5ceae"},
{file = "pyarrow-20.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15aa1b3b2587e74328a730457068dc6c89e6dcbf438d4369f572af9d320a25ee"},
{file = "pyarrow-20.0.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:5605919fbe67a7948c1f03b9f3727d82846c053cd2ce9303ace791855923fd20"},
{file = "pyarrow-20.0.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a5704f29a74b81673d266e5ec1fe376f060627c2e42c5c7651288ed4b0db29e9"},
{file = "pyarrow-20.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:00138f79ee1b5aca81e2bdedb91e3739b987245e11fa3c826f9e57c5d102fb75"},
{file = "pyarrow-20.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f2d67ac28f57a362f1a2c1e6fa98bfe2f03230f7e15927aecd067433b1e70ce8"},
{file = "pyarrow-20.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:4a8b029a07956b8d7bd742ffca25374dd3f634b35e46cc7a7c3fa4c75b297191"},
{file = "pyarrow-20.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:24ca380585444cb2a31324c546a9a56abbe87e26069189e14bdba19c86c049f0"},
{file = "pyarrow-20.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:95b330059ddfdc591a3225f2d272123be26c8fa76e8c9ee1a77aad507361cfdb"},
{file = "pyarrow-20.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f0fb1041267e9968c6d0d2ce3ff92e3928b243e2b6d11eeb84d9ac547308232"},
{file = "pyarrow-20.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b8ff87cc837601532cc8242d2f7e09b4e02404de1b797aee747dd4ba4bd6313f"},
{file = "pyarrow-20.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:7a3a5dcf54286e6141d5114522cf31dd67a9e7c9133d150799f30ee302a7a1ab"},
{file = "pyarrow-20.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:a6ad3e7758ecf559900261a4df985662df54fb7fdb55e8e3b3aa99b23d526b62"},
{file = "pyarrow-20.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:6bb830757103a6cb300a04610e08d9636f0cd223d32f388418ea893a3e655f1c"},
{file = "pyarrow-20.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:96e37f0766ecb4514a899d9a3554fadda770fb57ddf42b63d80f14bc20aa7db3"},
{file = "pyarrow-20.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:3346babb516f4b6fd790da99b98bed9708e3f02e734c84971faccb20736848dc"},
{file = "pyarrow-20.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:75a51a5b0eef32727a247707d4755322cb970be7e935172b6a3a9f9ae98404ba"},
{file = "pyarrow-20.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:211d5e84cecc640c7a3ab900f930aaff5cd2702177e0d562d426fb7c4f737781"},
{file = "pyarrow-20.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ba3cf4182828be7a896cbd232aa8dd6a31bd1f9e32776cc3796c012855e1199"},
{file = "pyarrow-20.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c3a01f313ffe27ac4126f4c2e5ea0f36a5fc6ab51f8726cf41fee4b256680bd"},
{file = "pyarrow-20.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:a2791f69ad72addd33510fec7bb14ee06c2a448e06b649e264c094c5b5f7ce28"},
{file = "pyarrow-20.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:4250e28a22302ce8692d3a0e8ec9d9dde54ec00d237cff4dfa9c1fbf79e472a8"},
{file = "pyarrow-20.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:89e030dc58fc760e4010148e6ff164d2f44441490280ef1e97a542375e41058e"},
{file = "pyarrow-20.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6102b4864d77102dbbb72965618e204e550135a940c2534711d5ffa787df2a5a"},
|
||||
{file = "pyarrow-20.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:96d6a0a37d9c98be08f5ed6a10831d88d52cac7b13f5287f1e0f625a0de8062b"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:a15532e77b94c61efadde86d10957950392999503b3616b2ffcef7621a002893"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:dd43f58037443af715f34f1322c782ec463a3c8a94a85fdb2d987ceb5658e061"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa0d288143a8585806e3cc7c39566407aab646fb9ece164609dac1cfff45f6ae"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b6953f0114f8d6f3d905d98e987d0924dabce59c3cda380bdfaa25a6201563b4"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:991f85b48a8a5e839b2128590ce07611fae48a904cae6cab1f089c5955b57eb5"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:97c8dc984ed09cb07d618d57d8d4b67a5100a30c3818c2fb0b04599f0da2de7b"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9b71daf534f4745818f96c214dbc1e6124d7daf059167330b610fc69b6f3d3e3"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e8b88758f9303fa5a83d6c90e176714b2fd3852e776fc2d7e42a22dd6c2fb368"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:30b3051b7975801c1e1d387e17c588d8ab05ced9b1e14eec57915f79869b5031"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:ca151afa4f9b7bc45bcc791eb9a89e90a9eb2772767d0b1e5389609c7d03db63"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-macosx_12_0_x86_64.whl", hash = "sha256:4680f01ecd86e0dd63e39eb5cd59ef9ff24a9d166db328679e36c108dc993d4c"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7f4c8534e2ff059765647aa69b75d6543f9fef59e2cd4c6d18015192565d2b70"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3e1f8a47f4b4ae4c69c4d702cfbdfe4d41e18e5c7ef6f1bb1c50918c1e81c57b"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:a1f60dc14658efaa927f8214734f6a01a806d7690be4b3232ba526836d216122"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:204a846dca751428991346976b914d6d2a82ae5b8316a6ed99789ebf976551e6"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:f3b117b922af5e4c6b9a9115825726cac7d8b1421c37c2b5e24fbacc8930612c"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e724a3fd23ae5b9c010e7be857f4405ed5e679db5c93e66204db1a69f733936a"},
|
||||
{file = "pyarrow-20.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:82f1ee5133bd8f49d31be1299dc07f585136679666b502540db854968576faf9"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:1bcbe471ef3349be7714261dea28fe280db574f9d0f77eeccc195a2d161fd861"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-macosx_12_0_x86_64.whl", hash = "sha256:a18a14baef7d7ae49247e75641fd8bcbb39f44ed49a9fc4ec2f65d5031aa3b96"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb497649e505dc36542d0e68eca1a3c94ecbe9799cb67b578b55f2441a247fbc"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:11529a2283cb1f6271d7c23e4a8f9f8b7fd173f7360776b668e509d712a02eec"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:6fc1499ed3b4b57ee4e090e1cea6eb3584793fe3d1b4297bbf53f09b434991a5"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:db53390eaf8a4dab4dbd6d93c85c5cf002db24902dbff0ca7d988beb5c9dd15b"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:851c6a8260ad387caf82d2bbf54759130534723e37083111d4ed481cb253cc0d"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:e22f80b97a271f0a7d9cd07394a7d348f80d3ac63ed7cc38b6d1b696ab3b2619"},
|
||||
{file = "pyarrow-20.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:9965a050048ab02409fb7cbbefeedba04d3d67f2cc899eff505cc084345959ca"},
|
||||
{file = "pyarrow-20.0.0.tar.gz", hash = "sha256:febc4a913592573c8d5805091a6c2b5064c8bd6e002131f01061797d91c783c1"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
|
|
@@ -8120,8 +8140,8 @@ astroid = ">=3.3.8,<=3.4.0.dev0"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
dill = [
    {version = ">=0.2", markers = "python_version < \"3.11\""},
-    {version = ">=0.3.6", markers = "python_version >= \"3.11\""},
    {version = ">=0.3.7", markers = "python_version >= \"3.12\""},
+    {version = ">=0.3.6", markers = "python_version == \"3.11\""},
]
isort = ">=4.2.5,<5.13 || >5.13,<7"
mccabe = ">=0.6,<0.8"
@@ -8135,15 +8155,15 @@ testutils = ["gitpython (>3)"]

[[package]]
name = "pymdown-extensions"
-version = "10.14.3"
+version = "10.15"
description = "Extension pack for Python Markdown."
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"dev\""
files = [
-    {file = "pymdown_extensions-10.14.3-py3-none-any.whl", hash = "sha256:05e0bee73d64b9c71a4ae17c72abc2f700e8bc8403755a00580b49a4e9f189e9"},
-    {file = "pymdown_extensions-10.14.3.tar.gz", hash = "sha256:41e576ce3f5d650be59e900e4ceff231e0aed2a88cf30acaee41e02f063a061b"},
+    {file = "pymdown_extensions-10.15-py3-none-any.whl", hash = "sha256:46e99bb272612b0de3b7e7caf6da8dd5f4ca5212c0b273feb9304e236c484e5f"},
+    {file = "pymdown_extensions-10.15.tar.gz", hash = "sha256:0e5994e32155f4b03504f939e501b981d306daf7ec2aa1cd2eb6bd300784f8f7"},
]

[package.dependencies]
@@ -8155,15 +8175,15 @@ extra = ["pygments (>=2.19.1)"]

[[package]]
name = "pymilvus"
-version = "2.5.7"
+version = "2.5.8"
description = "Python Sdk for Milvus"
optional = true
python-versions = ">=3.8"
groups = ["main"]
markers = "extra == \"milvus\""
files = [
-    {file = "pymilvus-2.5.7-py3-none-any.whl", hash = "sha256:91373cb1a9576ceccd422182e50bdd29f7cc9228447bdf73e22c10bd225536d3"},
-    {file = "pymilvus-2.5.7.tar.gz", hash = "sha256:4c092a01d847eb704b122625261e9db167c050b4a383d1259f1988bf6287dcf3"},
+    {file = "pymilvus-2.5.8-py3-none-any.whl", hash = "sha256:6f33c9e78c041373df6a94724c90ca83448fd231aa33d6298a7a84ed2a5a0236"},
+    {file = "pymilvus-2.5.8.tar.gz", hash = "sha256:48923e7efeebcc366d32b644772796f60484e0ca1a5afc1606d21a10ed98133c"},
]

[package.dependencies]
@@ -8250,7 +8270,7 @@ description = "A python implementation of GNU readline."
optional = true
python-versions = ">=3.8"
groups = ["main"]
-markers = "sys_platform == \"win32\" and (python_version == \"3.10\" or extra == \"chromadb\" or extra == \"codegraph\") and (extra == \"chromadb\" or python_version == \"3.12\" or python_version == \"3.10\" or python_version == \"3.11\") and (extra == \"codegraph\" or extra == \"chromadb\")"
+markers = "sys_platform == \"win32\" and python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or sys_platform == \"win32\" and extra == \"chromadb\" or sys_platform == \"win32\" and (extra == \"chromadb\" or extra == \"codegraph\") and python_version == \"3.11\" or sys_platform == \"win32\" and (extra == \"chromadb\" or extra == \"codegraph\") and python_version == \"3.12\""
files = [
    {file = "pyreadline3-3.5.4-py3-none-any.whl", hash = "sha256:eaf8e6cc3c49bcccf145fc6067ba8643d1df34d604a1ec0eccbf7a18e6d3fae6"},
    {file = "pyreadline3-3.5.4.tar.gz", hash = "sha256:8d57d53039a1c75adba8e50dd3d992b28143480816187ea5efbd5c78e6c885b7"},
@@ -8377,6 +8397,26 @@ pytest = ">=7.0.0"
docs = ["sphinx (>=5.3)", "sphinx-rtd-theme (>=1.0)"]
testing = ["coverage (>=6.2)", "flaky (>=3.5.0)", "hypothesis (>=5.7.1)", "mypy (>=0.931)", "pytest-trio (>=0.7.0)"]

+[[package]]
+name = "pytest-cov"
+version = "6.1.1"
+description = "Pytest plugin for measuring coverage."
+optional = true
+python-versions = ">=3.9"
+groups = ["main"]
+markers = "extra == \"dev\""
+files = [
+    {file = "pytest_cov-6.1.1-py3-none-any.whl", hash = "sha256:bddf29ed2d0ab6f4df17b4c55b0a657287db8684af9c42ea546b21b1041b3dde"},
+    {file = "pytest_cov-6.1.1.tar.gz", hash = "sha256:46935f7aaefba760e716c2ebfbe1c216240b9592966e7da99ea8292d4d3e2a0a"},
+]
+
+[package.dependencies]
+coverage = {version = ">=7.5", extras = ["toml"]}
+pytest = ">=4.6"
+
+[package.extras]
+testing = ["fields", "hunter", "process-tests", "pytest-xdist", "virtualenv"]
+
[[package]]
name = "pytest-repeat"
version = "0.9.4"
@@ -8870,15 +8910,15 @@ fastembed-gpu = ["fastembed-gpu (==0.3.6) ; python_version < \"3.13\""]

[[package]]
name = "qdrant-client"
-version = "1.14.1"
+version = "1.14.2"
description = "Client library for the Qdrant vector search engine"
optional = true
python-versions = ">=3.9"
groups = ["main"]
markers = "python_version < \"3.13\" and extra == \"qdrant\""
files = [
-    {file = "qdrant_client-1.14.1-py3-none-any.whl", hash = "sha256:1c4d5ed791873698da8b5df68df16bb203ec1b0cd6cec0fd6002572a06291a1b"},
-    {file = "qdrant_client-1.14.1.tar.gz", hash = "sha256:75352057ea59fdd7987313dc9cef4d83953591d083028d94eac99cd0e5e2f607"},
+    {file = "qdrant_client-1.14.2-py3-none-any.whl", hash = "sha256:7c283b1f0e71db9c21b85d898fb395791caca2a6d56ee751da96d797b001410c"},
+    {file = "qdrant_client-1.14.2.tar.gz", hash = "sha256:da5cab4d367d099d1330b6f30d45aefc8bd76f8b8f9d8fa5d4f813501b93af0d"},
]

[package.dependencies]
@@ -9464,31 +9504,31 @@ pyasn1 = ">=0.1.3"

[[package]]
name = "ruff"
-version = "0.11.6"
+version = "0.11.7"
description = "An extremely fast Python linter and code formatter, written in Rust."
optional = true
python-versions = ">=3.7"
groups = ["main"]
markers = "extra == \"dev\""
files = [
-    {file = "ruff-0.11.6-py3-none-linux_armv6l.whl", hash = "sha256:d84dcbe74cf9356d1bdb4a78cf74fd47c740bf7bdeb7529068f69b08272239a1"},
-    {file = "ruff-0.11.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:9bc583628e1096148011a5d51ff3c836f51899e61112e03e5f2b1573a9b726de"},
-    {file = "ruff-0.11.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:f2959049faeb5ba5e3b378709e9d1bf0cab06528b306b9dd6ebd2a312127964a"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63c5d4e30d9d0de7fedbfb3e9e20d134b73a30c1e74b596f40f0629d5c28a193"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:26a4b9a4e1439f7d0a091c6763a100cef8fbdc10d68593df6f3cfa5abdd9246e"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b5edf270223dd622218256569636dc3e708c2cb989242262fe378609eccf1308"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:f55844e818206a9dd31ff27f91385afb538067e2dc0beb05f82c293ab84f7d55"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d8f782286c5ff562e4e00344f954b9320026d8e3fae2ba9e6948443fafd9ffc"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:01c63ba219514271cee955cd0adc26a4083df1956d57847978383b0e50ffd7d2"},
-    {file = "ruff-0.11.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15adac20ef2ca296dd3d8e2bedc6202ea6de81c091a74661c3666e5c4c223ff6"},
-    {file = "ruff-0.11.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:4dd6b09e98144ad7aec026f5588e493c65057d1b387dd937d7787baa531d9bc2"},
-    {file = "ruff-0.11.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:45b2e1d6c0eed89c248d024ea95074d0e09988d8e7b1dad8d3ab9a67017a5b03"},
-    {file = "ruff-0.11.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:bd40de4115b2ec4850302f1a1d8067f42e70b4990b68838ccb9ccd9f110c5e8b"},
-    {file = "ruff-0.11.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:77cda2dfbac1ab73aef5e514c4cbfc4ec1fbef4b84a44c736cc26f61b3814cd9"},
-    {file = "ruff-0.11.6-py3-none-win32.whl", hash = "sha256:5151a871554be3036cd6e51d0ec6eef56334d74dfe1702de717a995ee3d5b287"},
-    {file = "ruff-0.11.6-py3-none-win_amd64.whl", hash = "sha256:cce85721d09c51f3b782c331b0abd07e9d7d5f775840379c640606d3159cae0e"},
-    {file = "ruff-0.11.6-py3-none-win_arm64.whl", hash = "sha256:3567ba0d07fb170b1b48d944715e3294b77f5b7679e8ba258199a250383ccb79"},
-    {file = "ruff-0.11.6.tar.gz", hash = "sha256:bec8bcc3ac228a45ccc811e45f7eb61b950dbf4cf31a67fa89352574b01c7d79"},
+    {file = "ruff-0.11.7-py3-none-linux_armv6l.whl", hash = "sha256:d29e909d9a8d02f928d72ab7837b5cbc450a5bdf578ab9ebee3263d0a525091c"},
+    {file = "ruff-0.11.7-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:dd1fb86b168ae349fb01dd497d83537b2c5541fe0626e70c786427dd8363aaee"},
+    {file = "ruff-0.11.7-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d3d7d2e140a6fbbc09033bce65bd7ea29d6a0adeb90b8430262fbacd58c38ada"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4809df77de390a1c2077d9b7945d82f44b95d19ceccf0c287c56e4dc9b91ca64"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f3a0c2e169e6b545f8e2dba185eabbd9db4f08880032e75aa0e285a6d3f48201"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49b888200a320dd96a68e86736cf531d6afba03e4f6cf098401406a257fcf3d6"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:2b19cdb9cf7dae00d5ee2e7c013540cdc3b31c4f281f1dacb5a799d610e90db4"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:64e0ee994c9e326b43539d133a36a455dbaab477bc84fe7bfbd528abe2f05c1e"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bad82052311479a5865f52c76ecee5d468a58ba44fb23ee15079f17dd4c8fd63"},
+    {file = "ruff-0.11.7-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7940665e74e7b65d427b82bffc1e46710ec7f30d58b4b2d5016e3f0321436502"},
+    {file = "ruff-0.11.7-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:169027e31c52c0e36c44ae9a9c7db35e505fee0b39f8d9fca7274a6305295a92"},
+    {file = "ruff-0.11.7-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:305b93f9798aee582e91e34437810439acb28b5fc1fee6b8205c78c806845a94"},
+    {file = "ruff-0.11.7-py3-none-musllinux_1_2_i686.whl", hash = "sha256:a681db041ef55550c371f9cd52a3cf17a0da4c75d6bd691092dfc38170ebc4b6"},
+    {file = "ruff-0.11.7-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:07f1496ad00a4a139f4de220b0c97da6d4c85e0e4aa9b2624167b7d4d44fd6b6"},
+    {file = "ruff-0.11.7-py3-none-win32.whl", hash = "sha256:f25dfb853ad217e6e5f1924ae8a5b3f6709051a13e9dad18690de6c8ff299e26"},
+    {file = "ruff-0.11.7-py3-none-win_amd64.whl", hash = "sha256:0a931d85959ceb77e92aea4bbedfded0a31534ce191252721128f77e5ae1f98a"},
+    {file = "ruff-0.11.7-py3-none-win_arm64.whl", hash = "sha256:778c1e5d6f9e91034142dfd06110534ca13220bfaad5c3735f6cb844654f6177"},
+    {file = "ruff-0.11.7.tar.gz", hash = "sha256:655089ad3224070736dc32844fde783454f8558e71f501cb207485fe4eee23d4"},
]

[[package]]
@@ -9859,14 +9899,14 @@ unleash = ["UnleashClient (>=6.0.1)"]

[[package]]
name = "setuptools"
-version = "79.0.1"
+version = "80.0.0"
description = "Easily download, build, install, upgrade, and uninstall Python packages"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
-    {file = "setuptools-79.0.1-py3-none-any.whl", hash = "sha256:e147c0549f27767ba362f9da434eab9c5dc0045d5304feb602a0af001089fc51"},
-    {file = "setuptools-79.0.1.tar.gz", hash = "sha256:128ce7b8f33c3079fd1b067ecbb4051a66e8526e7b65f6cec075dfc650ddfa88"},
+    {file = "setuptools-80.0.0-py3-none-any.whl", hash = "sha256:a38f898dcd6e5380f4da4381a87ec90bd0a7eec23d204a5552e80ee3cab6bd27"},
+    {file = "setuptools-80.0.0.tar.gz", hash = "sha256:c40a5b3729d58dd749c0f08f1a07d134fb8a0a3d7f87dc33e7c5e1f762138650"},
]

[package.extras]
@@ -10173,14 +10213,14 @@ sqlcipher = ["sqlcipher3_binary"]

[[package]]
name = "sqlglot"
-version = "26.16.1"
+version = "26.16.2"
description = "An easily customizable SQL parser and transpiler"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
-    {file = "sqlglot-26.16.1-py3-none-any.whl", hash = "sha256:496cb742da55d491ae0c5b38d84e498362ad17a1eef1009d9b336b108a9ee636"},
-    {file = "sqlglot-26.16.1.tar.gz", hash = "sha256:cced52b35bebb828722f2f4ae4d677d840470ef348f160945ae0ef3d4e457ef8"},
+    {file = "sqlglot-26.16.2-py3-none-any.whl", hash = "sha256:0162f6c651f5786e2c0a6a1399c07967d8dfef61d9dde1858d58d4903c649ef1"},
+    {file = "sqlglot-26.16.2.tar.gz", hash = "sha256:81278c5dcbc4935fe233d6d492ea2e991ba6d03c6609ac49a4d2e373cfa77898"},
]

[package.extras]
@@ -10253,14 +10293,14 @@ files = [

[[package]]
name = "structlog"
-version = "25.2.0"
+version = "25.3.0"
description = "Structured Logging for Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
-    {file = "structlog-25.2.0-py3-none-any.whl", hash = "sha256:0fecea2e345d5d491b72f3db2e5fcd6393abfc8cd06a4851f21fcd4d1a99f437"},
-    {file = "structlog-25.2.0.tar.gz", hash = "sha256:d9f9776944207d1035b8b26072b9b140c63702fd7aa57c2f85d28ab701bd8e92"},
+    {file = "structlog-25.3.0-py3-none-any.whl", hash = "sha256:a341f5524004c158498c3127eecded091eb67d3a611e7a3093deca30db06e172"},
+    {file = "structlog-25.3.0.tar.gz", hash = "sha256:8dab497e6f6ca962abad0c283c46744185e0c9ba900db52a423cb6db99f7abeb"},
]

[package.dependencies]
@ -10274,15 +10314,15 @@ typing = ["mypy (>=1.4)", "rich", "twisted"]
|
|||
|
||||
[[package]]
|
||||
name = "sympy"
|
||||
version = "1.13.3"
|
||||
version = "1.14.0"
|
||||
description = "Computer algebra system (CAS) in Python"
|
||||
optional = true
|
||||
python-versions = ">=3.8"
|
||||
python-versions = ">=3.9"
|
||||
groups = ["main"]
|
||||
markers = "python_version == \"3.10\" and (extra == \"chromadb\" or extra == \"codegraph\") or extra == \"chromadb\" or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\")"
|
||||
markers = "python_version == \"3.10\" and (extra == \"codegraph\" or extra == \"chromadb\") or extra == \"chromadb\" or python_version == \"3.11\" and (extra == \"chromadb\" or extra == \"codegraph\") or python_version == \"3.12\" and (extra == \"chromadb\" or extra == \"codegraph\")"
|
||||
files = [
|
||||
{file = "sympy-1.13.3-py3-none-any.whl", hash = "sha256:54612cf55a62755ee71824ce692986f23c88ffa77207b30c1368eda4a7060f73"},
|
||||
{file = "sympy-1.13.3.tar.gz", hash = "sha256:b27fd2c6530e0ab39e275fc9b683895367e51d5da91baa8d3d64db2565fec4d9"},
|
||||
{file = "sympy-1.14.0-py3-none-any.whl", hash = "sha256:e091cc3e99d2141a0ba2847328f5479b05d94a6635cb96148ccb3f34671bd8f5"},
|
||||
{file = "sympy-1.14.0.tar.gz", hash = "sha256:d3d3fe8df1e5a0b42f0e7bdf50541697dbe7d23746e894990c030e2b05e72517"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
|
|
@@ -10466,7 +10506,7 @@ description = "A lil' TOML parser"
optional = true
python-versions = ">=3.8"
groups = ["main"]
-markers = "python_version < \"3.11\" and (extra == \"notebook\" or extra == \"dev\" or extra == \"deepeval\")"
+markers = "python_version < \"3.11\" and (extra == \"dev\" or extra == \"notebook\" or extra == \"deepeval\")"
files = [
    {file = "tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249"},
    {file = "tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6"},
@@ -10766,14 +10806,14 @@ urllib3 = ">=1.26.0"

[[package]]
name = "typer"
-version = "0.15.2"
+version = "0.15.3"
description = "Typer, build great CLIs. Easy to code. Based on Python type hints."
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
-    {file = "typer-0.15.2-py3-none-any.whl", hash = "sha256:46a499c6107d645a9c13f7ee46c5d5096cae6f5fc57dd11eccbbb9ae3e44ddfc"},
-    {file = "typer-0.15.2.tar.gz", hash = "sha256:ab2fab47533a813c49fe1f16b1a370fd5819099c00b119e0633df65f22144ba5"},
+    {file = "typer-0.15.3-py3-none-any.whl", hash = "sha256:c86a65ad77ca531f03de08d1b9cb67cd09ad02ddddf4b34745b5008f43b239bd"},
+    {file = "typer-0.15.3.tar.gz", hash = "sha256:818873625d0569653438316567861899f7e9972f2e6e0c16dab608345ced713c"},
]

[package.dependencies]
@@ -10797,14 +10837,14 @@ files = [

[[package]]
name = "types-setuptools"
-version = "79.0.0.20250422"
+version = "80.0.0.20250429"
description = "Typing stubs for setuptools"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
-    {file = "types_setuptools-79.0.0.20250422-py3-none-any.whl", hash = "sha256:55238c0b18cdc08dd26c32d6d8385ca1ea59b93dde760dae96d15868b7911990"},
-    {file = "types_setuptools-79.0.0.20250422.tar.gz", hash = "sha256:9c9f699a5914d2ed97f02ee749fb2c7bc2898f8dad03b5dd74b74d4f80e29972"},
+    {file = "types_setuptools-80.0.0.20250429-py3-none-any.whl", hash = "sha256:9bbcdcea88d1fda4b0f1371488333f606c78a8b10154a42530ed926ecf3242cb"},
+    {file = "types_setuptools-80.0.0.20250429.tar.gz", hash = "sha256:a4de44f1110f531e7f9453d72999437a1caa6052609e2c7c859dd6613ab0d593"},
]

[package.dependencies]
@ -11947,7 +11987,7 @@ api = ["gunicorn", "uvicorn"]
|
|||
chromadb = ["chromadb", "pypika"]
|
||||
codegraph = ["fastembed", "transformers", "tree-sitter", "tree-sitter-python"]
|
||||
deepeval = ["deepeval"]
|
||||
dev = ["coverage", "debugpy", "deptry", "gitpython", "mkdocs-material", "mkdocs-minify-plugin", "mkdocstrings", "mypy", "notebook", "pylance", "pylint", "pytest", "pytest-asyncio", "ruff", "tweepy"]
|
||||
dev = ["coverage", "debugpy", "deptry", "gitpython", "mkdocs-material", "mkdocs-minify-plugin", "mkdocstrings", "mypy", "notebook", "pylance", "pylint", "pytest", "pytest-asyncio", "pytest-cov", "ruff", "tweepy"]
|
||||
docs = ["unstructured"]
|
||||
evals = ["gdown", "plotly"]
|
||||
falkordb = ["falkordb"]
|
||||
|
|
@ -11972,4 +12012,4 @@ weaviate = ["weaviate-client"]
|
|||
[metadata]
|
||||
lock-version = "2.1"
|
||||
python-versions = ">=3.10,<=3.13"
|
||||
content-hash = "8ef9af57a2718509290e2c893a8c03df16c11af92a644876c9271a1e97b32ec1"
|
||||
content-hash = "7a300700800488853f060001d915b84d108461a6a39ec8299dcb6390fa78ea41"
|
||||
|
|
|
|||
|
|
@@ -27,7 +27,10 @@ dependencies = [
    "nltk==3.9.1",
    "numpy>=1.26.4, <=2.1",
    "pandas==2.2.3",
-    "boto3>=1.26.125,<2",
+    # Note: New s3fs and boto3 versions don't work well together
+    # Always use compatible fixed versions of these two dependencies
+    "s3fs==2025.3.2",
+    "boto3==1.37.1",
    "botocore>=1.35.54,<2",
    "sqlalchemy==2.0.39",
    "aiosqlite>=0.20.0,<0.21",
|
|
@ -55,7 +58,6 @@ dependencies = [
|
|||
"dlt[sqlalchemy]>=1.9.0,<2",
|
||||
"sentry-sdk[fastapi]>=2.9.0,<3",
|
||||
"structlog>=25.2.0,<26",
|
||||
"s3fs>=2025.3.2,<2026",
|
||||
]
|
||||
|
||||
[project.optional-dependencies]
|
||||
|
|
@@ -110,6 +112,7 @@ gui = [
graphiti = ["graphiti-core>=0.7.0,<0.8"]
dev = [
    "pytest>=7.4.0,<8",
+    "pytest-cov>=6.1.1",
    "pytest-asyncio>=0.21.1,<0.22",
    "coverage>=7.3.2,<8",
    "mypy>=1.7.1,<2",
uv.lock (generated, 54 changes)
@@ -932,6 +932,7 @@ dev = [
    { name = "pylint" },
    { name = "pytest" },
    { name = "pytest-asyncio" },
+    { name = "pytest-cov" },
    { name = "ruff" },
    { name = "tweepy" },
]
@@ -1009,7 +1010,7 @@ requires-dist = [
    { name = "alembic", specifier = ">=1.13.3,<2" },
    { name = "anthropic", marker = "extra == 'anthropic'", specifier = ">=0.26.1,<0.27" },
    { name = "asyncpg", marker = "extra == 'postgres'", specifier = "==0.30.0" },
-    { name = "boto3", specifier = ">=1.26.125,<2" },
+    { name = "boto3", specifier = "==1.37.1" },
    { name = "botocore", specifier = ">=1.35.54,<2" },
    { name = "chromadb", marker = "extra == 'chromadb'", specifier = ">=0.3.0,<0.7" },
    { name = "coverage", marker = "extra == 'dev'", specifier = ">=7.3.2,<8" },
@@ -1069,12 +1070,13 @@ requires-dist = [
    { name = "pyside6", marker = "extra == 'gui'", specifier = ">=6.8.3,<7" },
    { name = "pytest", marker = "extra == 'dev'", specifier = ">=7.4.0,<8" },
    { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.21.1,<0.22" },
+    { name = "pytest-cov", marker = "extra == 'dev'", specifier = ">=6.1.1" },
    { name = "python-dotenv", specifier = "==1.0.1" },
    { name = "python-multipart", specifier = "==0.0.20" },
    { name = "qasync", marker = "extra == 'gui'", specifier = ">=0.27.1,<0.28" },
    { name = "qdrant-client", marker = "extra == 'qdrant'", specifier = ">=1.9.0,<2" },
    { name = "ruff", marker = "extra == 'dev'", specifier = ">=0.9.2,<1.0.0" },
-    { name = "s3fs", specifier = ">=2025.3.2,<2026" },
+    { name = "s3fs", specifier = "==2025.3.2" },
    { name = "scikit-learn", specifier = ">=1.6.1,<2" },
    { name = "sentry-sdk", extras = ["fastapi"], specifier = ">=2.9.0,<3" },
    { name = "sqlalchemy", specifier = "==2.0.39" },
@@ -1252,6 +1254,11 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/59/f1/4da7717f0063a222db253e7121bd6a56f6fb1ba439dcc36659088793347c/coverage-7.8.0-py3-none-any.whl", hash = "sha256:dbf364b4c5e7bae9250528167dfe40219b62e2d573c854d74be213e1e52069f7", size = 203435 },
]
+
+[package.optional-dependencies]
+toml = [
+    { name = "tomli", marker = "python_full_version <= '3.11'" },
+]

[[package]]
name = "cryptography"
version = "44.0.2"
@@ -1810,17 +1817,17 @@ resolution-markers = [
    "python_full_version >= '3.12' and python_full_version < '3.12.4'",
]
dependencies = [
-    { name = "huggingface-hub", marker = "python_full_version >= '3.12'" },
-    { name = "loguru", marker = "python_full_version >= '3.12'" },
-    { name = "mmh3", marker = "python_full_version >= '3.12'" },
-    { name = "numpy", marker = "python_full_version >= '3.12'" },
-    { name = "onnx", marker = "python_full_version >= '3.12'" },
-    { name = "onnxruntime", marker = "python_full_version >= '3.12'" },
-    { name = "pillow", marker = "python_full_version >= '3.12'" },
-    { name = "py-rust-stemmers", marker = "python_full_version >= '3.12'" },
-    { name = "requests", marker = "python_full_version >= '3.12'" },
-    { name = "tokenizers", marker = "python_full_version >= '3.12'" },
-    { name = "tqdm", marker = "python_full_version >= '3.12'" },
+    { name = "huggingface-hub", marker = "python_full_version == '3.12.*'" },
+    { name = "loguru", marker = "python_full_version == '3.12.*'" },
+    { name = "mmh3", marker = "python_full_version == '3.12.*'" },
+    { name = "numpy", marker = "python_full_version == '3.12.*'" },
+    { name = "onnx", marker = "python_full_version == '3.12.*'" },
+    { name = "onnxruntime", marker = "python_full_version == '3.12.*'" },
+    { name = "pillow", marker = "python_full_version == '3.12.*'" },
+    { name = "py-rust-stemmers", marker = "python_full_version == '3.12.*'" },
+    { name = "requests", marker = "python_full_version == '3.12.*'" },
+    { name = "tokenizers", marker = "python_full_version == '3.12.*'" },
+    { name = "tqdm", marker = "python_full_version == '3.12.*'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/02/f0/8d935e8ea2408ccd34405ccb61bbcc340633597841e56e621cc5c4768405/fastembed-0.4.2.tar.gz", hash = "sha256:4065344ed795c2c860f31953ab9ead91291ce77952a3f7823ae64e3c8dc1a21c", size = 42084 }
wheels = [
@ -3808,8 +3815,8 @@ name = "loguru"
|
|||
version = "0.7.3"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||
{ name = "win32-setctime", marker = "sys_platform == 'win32'" },
|
||||
{ name = "colorama", marker = "python_full_version < '3.13' and sys_platform == 'win32'" },
|
||||
{ name = "win32-setctime", marker = "python_full_version < '3.13' and sys_platform == 'win32'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/3a/05/a1dae3dffd1116099471c643b8924f5aa6524411dc6c63fdae648c4f1aca/loguru-0.7.3.tar.gz", hash = "sha256:19480589e77d47b8d85b2c827ad95d49bf31b0dcde16593892eb51dd18706eb6", size = 63559 }
|
||||
wheels = [
|
||||
|
|
@ -4794,8 +4801,8 @@ name = "onnx"
|
|||
version = "1.17.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "numpy", marker = "python_full_version >= '3.12'" },
|
||||
{ name = "protobuf", marker = "python_full_version >= '3.12'" },
|
||||
{ name = "numpy", marker = "python_full_version == '3.12.*'" },
|
||||
{ name = "protobuf", marker = "python_full_version == '3.12.*'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/9a/54/0e385c26bf230d223810a9c7d06628d954008a5e5e4b73ee26ef02327282/onnx-1.17.0.tar.gz", hash = "sha256:48ca1a91ff73c1d5e3ea2eef20ae5d0e709bb8a2355ed798ffc2169753013fd3", size = 12165120 }
|
||||
wheels = [
|
||||
|
|
@@ -6075,6 +6082,19 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/9c/ce/1e4b53c213dce25d6e8b163697fbce2d43799d76fa08eea6ad270451c370/pytest_asyncio-0.21.2-py3-none-any.whl", hash = "sha256:ab664c88bb7998f711d8039cacd4884da6430886ae8bbd4eded552ed2004f16b", size = 13368 },
]

+[[package]]
+name = "pytest-cov"
+version = "6.1.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "coverage", extra = ["toml"] },
+    { name = "pytest" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/25/69/5f1e57f6c5a39f81411b550027bf72842c4567ff5fd572bed1edc9e4b5d9/pytest_cov-6.1.1.tar.gz", hash = "sha256:46935f7aaefba760e716c2ebfbe1c216240b9592966e7da99ea8292d4d3e2a0a", size = 66857 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/28/d0/def53b4a790cfb21483016430ed828f64830dd981ebe1089971cd10cab25/pytest_cov-6.1.1-py3-none-any.whl", hash = "sha256:bddf29ed2d0ab6f4df17b4c55b0a657287db8684af9c42ea546b21b1041b3dde", size = 23841 },
+]
+
[[package]]
name = "pytest-repeat"
version = "0.9.4"