Compare commits


1 commit

Author: Daniel Chalef
SHA1: 1408129bf1
Message: Update pyproject.toml
Date: 2025-11-14 10:30:14 -08:00
31 changed files with 522 additions and 280 deletions


@@ -0,0 +1,123 @@
name: Daily Issue Maintenance

on:
  schedule:
    - cron: "0 0 * * *" # Every day at midnight
  workflow_dispatch: # Manual trigger option

jobs:
  find-legacy-duplicates:
    runs-on: ubuntu-latest
    if: github.event_name == 'workflow_dispatch'
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - uses: anthropics/claude-code-action@v1
        with:
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          prompt: |
            REPO: ${{ github.repository }}
            Find potential duplicate issues in the repository:
            1. Use `gh issue list --state open --limit 1000 --json number,title,body,createdAt` to get all open issues
            2. For each issue, search for potential duplicates using `gh search issues` with keywords from the title and body
            3. Compare issues to identify true duplicates using these criteria:
               - Same bug or error being reported
               - Same feature request (even if worded differently)
               - Same question being asked
               - Issues describing the same root problem
            For each duplicate found:
            - Add a comment linking to the original issue
            - Apply the "duplicate" label using `gh issue edit`
            - Be polite and explain why it's a duplicate
            Focus on finding true duplicates, not just similar issues.
          claude_args: |
            --allowedTools "Bash(gh issue:*),Bash(gh search:*)"
            --model claude-sonnet-4-5-20250929

  check-stale-issues:
    runs-on: ubuntu-latest
    if: github.event_name == 'schedule'
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - uses: anthropics/claude-code-action@v1
        with:
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          prompt: |
            REPO: ${{ github.repository }}
            Review stale issues and request confirmation:
            1. Use `gh issue list --state open --limit 1000 --json number,title,updatedAt,comments` to get all open issues
            2. Identify issues that are:
               - Older than 60 days (based on updatedAt)
               - Have no comments with "stale-check" label
               - Are not labeled as "enhancement" or "documentation"
            3. For each stale issue:
               - Add a polite comment asking the issue originator if this is still relevant
               - Apply a "stale-check" label to track that we've asked
               - Use format: "@{author} Is this still an issue? Please confirm within 14 days or this issue will be closed."
            Use:
            - `gh issue view` to check issue details and labels
            - `gh issue comment` to add comments
            - `gh issue edit` to add the "stale-check" label
          claude_args: |
            --allowedTools "Bash(gh issue:*)"
            --model claude-sonnet-4-5-20250929

  close-unconfirmed-issues:
    runs-on: ubuntu-latest
    if: github.event_name == 'schedule'
    needs: check-stale-issues
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - uses: anthropics/claude-code-action@v1
        with:
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          prompt: |
            REPO: ${{ github.repository }}
            Close unconfirmed stale issues:
            1. Use `gh issue list --state open --label "stale-check" --limit 1000 --json number,title,comments,updatedAt` to get issues with stale-check label
            2. For each issue, check if:
               - The "stale-check" comment was added 14+ days ago
               - There has been no response from the issue author or activity since the comment
            3. For issues meeting the criteria:
               - Add a polite closing comment
               - Close the issue using `gh issue close`
               - Use format: "Closing due to inactivity. Feel free to reopen if this is still relevant."
            Use:
            - `gh issue view` to check issue comments and activity
            - `gh issue comment` to add closing comment
            - `gh issue close` to close the issue
          claude_args: |
            --allowedTools "Bash(gh issue:*)"
            --model claude-sonnet-4-5-20250929
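The staleness criterion in the `check-stale-issues` prompt (no update within 60 days, judged by `updatedAt`) can be sketched in plain Python. This is a hypothetical illustration, not part of the workflow: the sample JSON below is shaped like `gh issue list --json number,title,updatedAt` output, and `find_stale` is an invented helper name.

```python
import json
from datetime import datetime, timedelta, timezone

# Hypothetical sample shaped like `gh issue list --json number,title,updatedAt` output.
issues_json = """[
  {"number": 101, "title": "Crash on startup", "updatedAt": "2023-01-01T00:00:00Z"},
  {"number": 102, "title": "Add dark mode", "updatedAt": "2099-01-01T00:00:00Z"}
]"""

def find_stale(issues: list[dict], days: int = 60) -> list[int]:
    """Return numbers of issues not updated within the last `days` days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    stale = []
    for issue in issues:
        # gh emits RFC 3339 timestamps; normalize the trailing Z for fromisoformat.
        updated = datetime.fromisoformat(issue["updatedAt"].replace("Z", "+00:00"))
        if updated < cutoff:
            stale.append(issue["number"])
    return stale

print(find_stale(json.loads(issues_json)))  # [101]
```

In the workflow itself this filtering is delegated to the model via `gh` commands; the sketch only makes the 60-day cutoff concrete.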

.github/workflows/issue-triage.yml (vendored, new file, 141 lines)

@@ -0,0 +1,141 @@
name: Issue Triage and Deduplication

on:
  issues:
    types: [opened]

jobs:
  triage:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Run Claude Code for Issue Triage
        uses: anthropics/claude-code-action@v1
        with:
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          allowed_non_write_users: "*"
          github_token: ${{ secrets.GITHUB_TOKEN }}
          prompt: |
            You're an issue triage assistant for GitHub issues. Your task is to analyze the issue and select appropriate labels from the provided list.
            IMPORTANT: Don't post any comments or messages to the issue. Your only action should be to apply labels. DO NOT check for duplicates - that's handled by a separate job.
            Issue Information:
            - REPO: ${{ github.repository }}
            - ISSUE_NUMBER: ${{ github.event.issue.number }}
            TASK OVERVIEW:
            1. First, fetch the list of labels available in this repository by running: `gh label list`. Run exactly this command with nothing else.
            2. Next, use gh commands to get context about the issue:
               - Use `gh issue view ${{ github.event.issue.number }}` to retrieve the current issue's details
               - Use `gh search issues` to find similar issues that might provide context for proper categorization
               - You have access to these Bash commands:
                 - Bash(gh label list:*) - to get available labels
                 - Bash(gh issue view:*) - to view issue details
                 - Bash(gh issue edit:*) - to apply labels to the issue
                 - Bash(gh search:*) - to search for similar issues
            3. Analyze the issue content, considering:
               - The issue title and description
               - The type of issue (bug report, feature request, question, etc.)
               - Technical areas mentioned
               - Database mentions (neo4j, falkordb, neptune, etc.)
               - LLM providers mentioned (openai, anthropic, gemini, groq, etc.)
               - Components affected (embeddings, search, prompts, server, mcp, etc.)
            4. Select appropriate labels from the available labels list:
               - Choose labels that accurately reflect the issue's nature
               - Be specific but comprehensive
               - Add database-specific labels if mentioned: neo4j, falkordb, neptune
               - Add component labels if applicable
               - DO NOT add priority labels (P1, P2, P3)
               - DO NOT add duplicate label - that's handled by the deduplication job
            5. Apply the selected labels:
               - Use `gh issue edit ${{ github.event.issue.number }} --add-label "label1,label2,label3"` to apply your selected labels
               - DO NOT post any comments explaining your decision
               - DO NOT communicate directly with users
               - If no labels are clearly applicable, do not apply any labels
            IMPORTANT GUIDELINES:
            - Be thorough in your analysis
            - Only select labels from the provided list
            - DO NOT post any comments to the issue
            - Your ONLY action should be to apply labels using gh issue edit
            - It's okay to not add any labels if none are clearly applicable
            - DO NOT check for duplicates
          claude_args: |
            --allowedTools "Bash(gh label list:*),Bash(gh issue view:*),Bash(gh issue edit:*),Bash(gh search:*)"
            --model claude-sonnet-4-5-20250929

  deduplicate:
    runs-on: ubuntu-latest
    timeout-minutes: 10
    needs: triage
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Check for duplicate issues
        uses: anthropics/claude-code-action@v1
        with:
          allowed_non_write_users: "*"
          prompt: |
            Analyze this new issue and check if it's a duplicate of existing issues in the repository.
            Issue: #${{ github.event.issue.number }}
            Repository: ${{ github.repository }}
            Your task:
            1. Use mcp__github__get_issue to get details of the current issue (#${{ github.event.issue.number }})
            2. Search for similar existing OPEN issues using mcp__github__search_issues with relevant keywords from the issue title and body
            3. Compare the new issue with existing ones to identify potential duplicates
            Criteria for duplicates:
            - Same bug or error being reported
            - Same feature request (even if worded differently)
            - Same question being asked
            - Issues describing the same root problem
            If you find duplicates:
            - Add a comment on the new issue linking to the original issue(s)
            - Apply the "duplicate" label to the new issue
            - Be polite and explain why it's a duplicate
            - Suggest the user follow the original issue for updates
            If it's NOT a duplicate:
            - Don't add any comments
            - Don't modify labels
            Use these tools:
            - mcp__github__get_issue: Get issue details
            - mcp__github__search_issues: Search for similar issues (use state:open)
            - mcp__github__list_issues: List recent issues if needed
            - mcp__github__create_issue_comment: Add a comment if duplicate found
            - mcp__github__update_issue: Add "duplicate" label
            Be thorough but efficient. Focus on finding true duplicates, not just similar issues.
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
          claude_args: |
            --allowedTools "mcp__github__get_issue,mcp__github__search_issues,mcp__github__list_issues,mcp__github__create_issue_comment,mcp__github__update_issue,mcp__github__get_issue_comments"
            --model claude-sonnet-4-5-20250929
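The deduplication prompt asks for a search "with relevant keywords from the issue title and body" but does not specify how to pick them. As a rough illustration of what that step involves, here is a naive frequency-based keyword picker; `search_keywords` and its stopword list are invented for this sketch and are not part of the workflow.

```python
import re
from collections import Counter

# Minimal stopword set for the sketch; a real list would be longer.
STOPWORDS = {"the", "a", "an", "is", "in", "on", "of", "to", "and", "when", "with"}

def search_keywords(title: str, body: str, limit: int = 5) -> list[str]:
    """Pick the most frequent non-stopword terms to feed into a search query."""
    words = re.findall(r"[a-z0-9_-]+", f"{title} {body}".lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(limit)]

kw = search_keywords(
    "Crash when adding episode with Neo4j driver",
    "Graphiti raises an error in add_episode when the Neo4j driver closes.",
)
print(kw)
```

Terms that recur across title and body ("neo4j", "driver") rank first, which is roughly the signal a duplicate search wants.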


@@ -479,7 +479,7 @@ graphiti = Graphiti(
     cross_encoder=GeminiRerankerClient(
         config=LLMConfig(
             api_key=api_key,
-            model="gemini-2.5-flash-lite"
+            model="gemini-2.5-flash-lite-preview-06-17"
         )
     )
 )
@@ -487,7 +487,7 @@
 # Now you can use Graphiti with Google Gemini for all components
 ```
-The Gemini reranker uses the `gemini-2.5-flash-lite` model by default, which is optimized for
+The Gemini reranker uses the `gemini-2.5-flash-lite-preview-06-17` model by default, which is optimized for
 cost-effective and low-latency classification tasks. It uses the same boolean classification approach as the OpenAI
 reranker, leveraging Gemini's log probabilities feature to rank passage relevance.
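The boolean-classification reranking described here (scoring each passage by the log probability of a "yes"/"True" token) reduces to a simple transform. This sketch is not the library's code; `relevance_score` and the sample logprobs are hypothetical.

```python
import math

def relevance_score(yes_logprob: float) -> float:
    """Convert the log probability of an affirmative token into a 0-1 relevance score."""
    return math.exp(yes_logprob)

# A passage classified "yes" at logprob -0.1 outranks one at -2.3.
candidates = [("passage A", -2.3), ("passage B", -0.1)]
ranked = sorted(candidates, key=lambda p: relevance_score(p[1]), reverse=True)
print([name for name, _ in ranked])  # ['passage B', 'passage A']
```

Since `exp` is monotonic, sorting by the raw logprob gives the same order; the exponential just makes the scores interpretable as probabilities.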


@@ -34,7 +34,7 @@ You are not expected to provide support for Your Contributions, except to the ex
 ## Third-Party Submissions
-Should You wish to submit work that is not Your original creation, You may submit it to Zep separately from any Contribution, identifying the complete details of its source and of any license or other restriction (including, but not limited to, related patents, trademarks, and license agreements) of which you are personally aware, and conspicuously marking the work as "Submitted on behalf of a third party: [named here]".
+Should You wish to submit work that is not Your original creation, You may submit it to Zep separately from any Contribution, identifying the complete details of its source and of any license or other restriction (including, but not limited to, related patents, trademarks, and license agreements) of which you are personally aware, and conspicuously marking the work as "Submitted on behalf of a third-party: [named here]".
 ## Notifications


@@ -40,8 +40,8 @@ from graphiti_core.nodes import EpisodeType
 # Configure logging
 logging.basicConfig(
     level=INFO,
-    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
-    datefmt='%Y-%m-%d %H:%M:%S',
+    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
+    datefmt="%Y-%m-%d %H:%M:%S",
 )
 logger = logging.getLogger(__name__)
@@ -49,20 +49,20 @@ load_dotenv()
 # Neo4j connection parameters
 # Make sure Neo4j Desktop is running with a local DBMS started
-neo4j_uri = os.environ.get('NEO4J_URI', 'bolt://localhost:7687')
-neo4j_user = os.environ.get('NEO4J_USER', 'neo4j')
-neo4j_password = os.environ.get('NEO4J_PASSWORD', 'password')
+neo4j_uri = os.environ.get("NEO4J_URI", "bolt://localhost:7687")
+neo4j_user = os.environ.get("NEO4J_USER", "neo4j")
+neo4j_password = os.environ.get("NEO4J_PASSWORD", "password")

 # Azure OpenAI connection parameters
-azure_endpoint = os.environ.get('AZURE_OPENAI_ENDPOINT')
-azure_api_key = os.environ.get('AZURE_OPENAI_API_KEY')
-azure_deployment = os.environ.get('AZURE_OPENAI_DEPLOYMENT', 'gpt-4.1')
+azure_endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT")
+azure_api_key = os.environ.get("AZURE_OPENAI_API_KEY")
+azure_deployment = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4.1")
 azure_embedding_deployment = os.environ.get(
-    'AZURE_OPENAI_EMBEDDING_DEPLOYMENT', 'text-embedding-3-small'
+    "AZURE_OPENAI_EMBEDDING_DEPLOYMENT", "text-embedding-3-small"
 )

 if not azure_endpoint or not azure_api_key:
-    raise ValueError('AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY must be set')
+    raise ValueError("AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY must be set")

 async def main():
@@ -76,7 +76,7 @@ async def main():
     # Initialize Azure OpenAI client
     azure_client = AsyncOpenAI(
-        base_url=f'{azure_endpoint}/openai/v1/',
+        base_url=f"{azure_endpoint}/openai/v1/",
         api_key=azure_api_key,
     )
@@ -112,40 +112,40 @@ async def main():
     # Episodes list containing both text and JSON episodes
     episodes = [
         {
-            'content': 'Kamala Harris is the Attorney General of California. She was previously '
-            'the district attorney for San Francisco.',
-            'type': EpisodeType.text,
-            'description': 'podcast transcript',
+            "content": "Kamala Harris is the Attorney General of California. She was previously "
+            "the district attorney for San Francisco.",
+            "type": EpisodeType.text,
+            "description": "podcast transcript",
         },
         {
-            'content': 'As AG, Harris was in office from January 3, 2011 January 3, 2017',
-            'type': EpisodeType.text,
-            'description': 'podcast transcript',
+            "content": "As AG, Harris was in office from January 3, 2011 January 3, 2017",
+            "type": EpisodeType.text,
+            "description": "podcast transcript",
         },
         {
-            'content': {
-                'name': 'Gavin Newsom',
-                'position': 'Governor',
-                'state': 'California',
-                'previous_role': 'Lieutenant Governor',
-                'previous_location': 'San Francisco',
+            "content": {
+                "name": "Gavin Newsom",
+                "position": "Governor",
+                "state": "California",
+                "previous_role": "Lieutenant Governor",
+                "previous_location": "San Francisco",
             },
-            'type': EpisodeType.json,
-            'description': 'podcast metadata',
+            "type": EpisodeType.json,
+            "description": "podcast metadata",
         },
     ]

     # Add episodes to the graph
     for i, episode in enumerate(episodes):
         await graphiti.add_episode(
-            name=f'California Politics {i}',
+            name=f"California Politics {i}",
             episode_body=(
-                episode['content']
-                if isinstance(episode['content'], str)
-                else json.dumps(episode['content'])
+                episode["content"]
+                if isinstance(episode["content"], str)
+                else json.dumps(episode["content"])
             ),
-            source=episode['type'],
-            source_description=episode['description'],
+            source=episode["type"],
+            source_description=episode["description"],
             reference_time=datetime.now(timezone.utc),
         )
         print(f'Added episode: California Politics {i} ({episode["type"].value})')
@@ -161,18 +161,18 @@ async def main():
     # Perform a hybrid search combining semantic similarity and BM25 retrieval
     print("\nSearching for: 'Who was the California Attorney General?'")
-    results = await graphiti.search('Who was the California Attorney General?')
+    results = await graphiti.search("Who was the California Attorney General?")

     # Print search results
-    print('\nSearch Results:')
+    print("\nSearch Results:")
     for result in results:
-        print(f'UUID: {result.uuid}')
-        print(f'Fact: {result.fact}')
-        if hasattr(result, 'valid_at') and result.valid_at:
-            print(f'Valid from: {result.valid_at}')
-        if hasattr(result, 'invalid_at') and result.invalid_at:
-            print(f'Valid until: {result.invalid_at}')
-        print('---')
+        print(f"UUID: {result.uuid}")
+        print(f"Fact: {result.fact}")
+        if hasattr(result, "valid_at") and result.valid_at:
+            print(f"Valid from: {result.valid_at}")
+        if hasattr(result, "invalid_at") and result.invalid_at:
+            print(f"Valid until: {result.invalid_at}")
+        print("---")

     #################################################
     # CENTER NODE SEARCH
@@ -187,26 +187,26 @@ async def main():
         # Get the source node UUID from the top result
         center_node_uuid = results[0].source_node_uuid
-        print('\nReranking search results based on graph distance:')
-        print(f'Using center node UUID: {center_node_uuid}')
+        print("\nReranking search results based on graph distance:")
+        print(f"Using center node UUID: {center_node_uuid}")

         reranked_results = await graphiti.search(
-            'Who was the California Attorney General?',
+            "Who was the California Attorney General?",
             center_node_uuid=center_node_uuid,
         )

         # Print reranked search results
-        print('\nReranked Search Results:')
+        print("\nReranked Search Results:")
         for result in reranked_results:
-            print(f'UUID: {result.uuid}')
-            print(f'Fact: {result.fact}')
-            if hasattr(result, 'valid_at') and result.valid_at:
-                print(f'Valid from: {result.valid_at}')
-            if hasattr(result, 'invalid_at') and result.invalid_at:
-                print(f'Valid until: {result.invalid_at}')
-            print('---')
+            print(f"UUID: {result.uuid}")
+            print(f"Fact: {result.fact}")
+            if hasattr(result, "valid_at") and result.valid_at:
+                print(f"Valid from: {result.valid_at}")
+            if hasattr(result, "invalid_at") and result.invalid_at:
+                print(f"Valid until: {result.invalid_at}")
+            print("---")
     else:
-        print('No results found in the initial search to use as center node.')
+        print("No results found in the initial search to use as center node.")

     finally:
         #################################################
@@ -218,8 +218,8 @@ async def main():
         # Close the connection
         await graphiti.close()
-        print('\nConnection closed')
+        print("\nConnection closed")

-if __name__ == '__main__':
+if __name__ == "__main__":
     asyncio.run(main())


@@ -540,7 +540,7 @@
 "submit_button = widgets.Button(description='Send')\n",
 "submit_button.on_click(on_submit)\n",
 "\n",
-"conversation_output.append_stdout('Assistant: Hello, how can I help you find shoes today?')\n",
+"conversation_output.append_stdout('Asssistant: Hello, how can I help you find shoes today?')\n",
 "\n",
 "display(widgets.VBox([input_box, submit_button, conversation_output]))"
 ]


@@ -37,7 +37,7 @@ else:
 logger = logging.getLogger(__name__)

-DEFAULT_MODEL = 'gemini-2.5-flash-lite'
+DEFAULT_MODEL = 'gemini-2.5-flash-lite-preview-06-17'

 class GeminiRerankerClient(CrossEncoderClient):


@@ -134,7 +134,7 @@ class KuzuDriver(GraphDriver):
         return KuzuDriverSession(self)

     async def close(self):
-        # Do not explicitly close the connection, instead rely on GC.
+        # Do not explicity close the connection, instead rely on GC.
         pass

     def delete_all_indexes(self, database_: str):


@@ -18,9 +18,7 @@ import logging
 from collections.abc import Coroutine
 from typing import Any

-import neo4j.exceptions
 from neo4j import AsyncGraphDatabase, EagerResult
-from neo4j.exceptions import ClientError
 from typing_extensions import LiteralString

 from graphiti_core.driver.driver import GraphDriver, GraphDriverSession, GraphProvider
@@ -72,15 +70,6 @@ class Neo4jDriver(GraphDriver):
         try:
             result = await self.client.execute_query(cypher_query_, parameters_=params, **kwargs)
-        except neo4j.exceptions.ClientError as e:
-            # Handle race condition when creating indices/constraints in parallel
-            # Neo4j 5.26+ may throw EquivalentSchemaRuleAlreadyExists even with IF NOT EXISTS
-            if 'EquivalentSchemaRuleAlreadyExists' in str(e):
-                logger.info(f'Index or constraint already exists, continuing: {cypher_query_}')
-                # Return empty result to indicate success (index exists)
-                return EagerResult([], None, None)  # type: ignore
-            logger.error(f'Error executing Neo4j query: {e}\n{cypher_query_}\n{params}')
-            raise
         except Exception as e:
             logger.error(f'Error executing Neo4j query: {e}\n{cypher_query_}\n{params}')
             raise
@@ -99,21 +88,6 @@ class Neo4jDriver(GraphDriver):
             'CALL db.indexes() YIELD name DROP INDEX name',
         )

-    async def _execute_index_query(self, query: LiteralString) -> EagerResult | None:
-        """Execute an index creation query, ignoring 'index already exists' errors.
-
-        Neo4j can raise EquivalentSchemaRuleAlreadyExists when concurrent CREATE INDEX
-        IF NOT EXISTS queries race, even though the index exists. This is safe to ignore.
-        """
-        try:
-            return await self.execute_query(query)
-        except ClientError as e:
-            # Ignore "equivalent index already exists" error (race condition with IF NOT EXISTS)
-            if 'EquivalentSchemaRuleAlreadyExists' in str(e):
-                logger.debug(f'Index already exists (concurrent creation): {query[:50]}...')
-                return None
-            raise
-
     async def build_indices_and_constraints(self, delete_existing: bool = False):
         if delete_existing:
             await self.delete_all_indexes()
@@ -124,8 +98,15 @@ class Neo4jDriver(GraphDriver):
         index_queries: list[LiteralString] = range_indices + fulltext_indices

-        await semaphore_gather(*[self._execute_index_query(query) for query in index_queries])
+        await semaphore_gather(
+            *[
+                self.execute_query(
+                    query,
+                )
+                for query in index_queries
+            ]
+        )

     async def health_check(self) -> None:
         """Check Neo4j connectivity by running the driver's verify_connectivity method."""
         try:
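The race-condition handler being removed here follows a general pattern: treat a benign "schema rule already exists" error from a concurrent `CREATE INDEX IF NOT EXISTS` as success. The sketch below shows that pattern in isolation; the `ClientError` class and `racy_run` callback are stand-ins invented for the example, not the neo4j driver's actual classes.

```python
import logging

logger = logging.getLogger(__name__)

class ClientError(Exception):
    """Stand-in for neo4j.exceptions.ClientError (hypothetical; neo4j is not imported here)."""

def execute_index_query(run_query, query: str):
    """Run an index-creation query, treating the 'already exists' race as success.

    Neo4j can raise EquivalentSchemaRuleAlreadyExists when concurrent
    CREATE INDEX IF NOT EXISTS statements race; the index exists either way,
    so the error is safe to swallow.
    """
    try:
        return run_query(query)
    except ClientError as e:
        if 'EquivalentSchemaRuleAlreadyExists' in str(e):
            logger.debug('Index already exists (concurrent creation): %s', query[:50])
            return None
        raise  # any other client error is a real failure

def racy_run(query: str):
    # Simulates the losing side of the index-creation race.
    raise ClientError('EquivalentSchemaRuleAlreadyExists: an equivalent index already exists')

print(execute_index_query(racy_run, 'CREATE INDEX entity_uuid IF NOT EXISTS FOR (n:Entity) ON (n.uuid)'))  # None
```

The string match on the error message is the same discriminator used by the removed code; only genuinely unexpected `ClientError`s propagate.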


@@ -33,7 +33,7 @@ class AzureOpenAIEmbedderClient(EmbedderClient):
     def __init__(
         self,
         azure_client: AsyncAzureOpenAI | AsyncOpenAI,
-        model: str = 'text-embedding-3-small',
+        model: str = "text-embedding-3-small",
     ):
         self.azure_client = azure_client
         self.model = model
@@ -44,18 +44,22 @@ class AzureOpenAIEmbedderClient(EmbedderClient):
             # Handle different input types
             if isinstance(input_data, str):
                 text_input = [input_data]
-            elif isinstance(input_data, list) and all(isinstance(item, str) for item in input_data):
+            elif isinstance(input_data, list) and all(
+                isinstance(item, str) for item in input_data
+            ):
                 text_input = input_data
             else:
                 # Convert to string list for other types
                 text_input = [str(input_data)]

-            response = await self.azure_client.embeddings.create(model=self.model, input=text_input)
+            response = await self.azure_client.embeddings.create(
+                model=self.model, input=text_input
+            )

             # Return the first embedding as a list of floats
             return response.data[0].embedding
         except Exception as e:
-            logger.error(f'Error in Azure OpenAI embedding: {e}')
+            logger.error(f"Error in Azure OpenAI embedding: {e}")
             raise

     async def create_batch(self, input_data_list: list[str]) -> list[list[float]]:
@@ -67,5 +71,5 @@ class AzureOpenAIEmbedderClient(EmbedderClient):
             return [embedding.embedding for embedding in response.data]
         except Exception as e:
-            logger.error(f'Error in Azure OpenAI batch embedding: {e}')
+            logger.error(f"Error in Azure OpenAI batch embedding: {e}")
             raise


@@ -47,9 +47,6 @@ else:
 logger = logging.getLogger(__name__)

 AnthropicModel = Literal[
-    'claude-sonnet-4-5-latest',
-    'claude-sonnet-4-5-20250929',
-    'claude-haiku-4-5-latest',
     'claude-3-7-sonnet-latest',
     'claude-3-7-sonnet-20250219',
     'claude-3-5-haiku-latest',
@@ -65,7 +62,7 @@ AnthropicModel = Literal[
     'claude-2.0',
 ]

-DEFAULT_MODEL: AnthropicModel = 'claude-haiku-4-5-latest'
+DEFAULT_MODEL: AnthropicModel = 'claude-3-7-sonnet-latest'

 # Maximum output tokens for different Anthropic models
 # Based on official Anthropic documentation (as of 2025)
@@ -73,10 +70,6 @@ DEFAULT_MODEL: AnthropicModel = 'claude-haiku-4-5-latest'
 # Some models support higher limits with additional configuration (e.g., Claude 3.7 supports
 # 128K with 'anthropic-beta: output-128k-2025-02-19' header, but this is not currently implemented).
 ANTHROPIC_MODEL_MAX_TOKENS = {
-    # Claude 4.5 models - 64K tokens
-    'claude-sonnet-4-5-latest': 65536,
-    'claude-sonnet-4-5-20250929': 65536,
-    'claude-haiku-4-5-latest': 65536,
     # Claude 3.7 models - standard 64K tokens
     'claude-3-7-sonnet-latest': 65536,
     'claude-3-7-sonnet-20250219': 65536,
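The `ANTHROPIC_MODEL_MAX_TOKENS` table is used as a per-model cap lookup. A minimal sketch of that usage, with a trimmed copy of the table; the `DEFAULT_MAX_TOKENS` fallback value and the `max_tokens_for` helper are assumptions for illustration, not taken from the source.

```python
# Trimmed mirror of the max-token table in the diff.
ANTHROPIC_MODEL_MAX_TOKENS = {
    'claude-3-7-sonnet-latest': 65536,
    'claude-3-7-sonnet-20250219': 65536,
}
DEFAULT_MAX_TOKENS = 8192  # assumed conservative fallback for unlisted models

def max_tokens_for(model: str) -> int:
    """Look up a model's output-token cap, falling back to a safe default."""
    return ANTHROPIC_MODEL_MAX_TOKENS.get(model, DEFAULT_MAX_TOKENS)

print(max_tokens_for('claude-3-7-sonnet-latest'))  # 65536
print(max_tokens_for('claude-2.0'))  # 8192 (fallback)
```

A `dict.get` with a default keeps unknown or newly released model names from crashing the client; they just get the conservative cap.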


@@ -66,21 +66,21 @@ class AzureOpenAILLMClient(BaseOpenAIClient):
         """Create a structured completion using Azure OpenAI's responses.parse API."""
         supports_reasoning = self._supports_reasoning_features(model)
         request_kwargs = {
-            'model': model,
-            'input': messages,
-            'max_output_tokens': max_tokens,
-            'text_format': response_model,  # type: ignore
+            "model": model,
+            "input": messages,
+            "max_output_tokens": max_tokens,
+            "text_format": response_model,  # type: ignore
         }

         temperature_value = temperature if not supports_reasoning else None
         if temperature_value is not None:
-            request_kwargs['temperature'] = temperature_value
+            request_kwargs["temperature"] = temperature_value

         if supports_reasoning and reasoning:
-            request_kwargs['reasoning'] = {'effort': reasoning}  # type: ignore
+            request_kwargs["reasoning"] = {"effort": reasoning}  # type: ignore
         if supports_reasoning and verbosity:
-            request_kwargs['text'] = {'verbosity': verbosity}  # type: ignore
+            request_kwargs["text"] = {"verbosity": verbosity}  # type: ignore

         return await self.client.responses.parse(**request_kwargs)
@@ -96,20 +96,20 @@ class AzureOpenAILLMClient(BaseOpenAIClient):
         supports_reasoning = self._supports_reasoning_features(model)
         request_kwargs = {
-            'model': model,
-            'messages': messages,
-            'max_tokens': max_tokens,
-            'response_format': {'type': 'json_object'},
+            "model": model,
+            "messages": messages,
+            "max_tokens": max_tokens,
+            "response_format": {"type": "json_object"},
         }

         temperature_value = temperature if not supports_reasoning else None
         if temperature_value is not None:
-            request_kwargs['temperature'] = temperature_value
+            request_kwargs["temperature"] = temperature_value

         return await self.client.chat.completions.create(**request_kwargs)

     @staticmethod
     def _supports_reasoning_features(model: str) -> bool:
         """Return True when the Azure model supports reasoning/verbosity options."""
-        reasoning_prefixes = ('o1', 'o3', 'gpt-5')
+        reasoning_prefixes = ("o1", "o3", "gpt-5")
         return model.startswith(reasoning_prefixes)
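The `_supports_reasoning_features` gate reduces to a tuple prefix check: `str.startswith` accepts a tuple and returns True if any prefix matches. A standalone sketch of the same test (the function name here is a free-standing copy for illustration):

```python
def supports_reasoning_features(model: str) -> bool:
    """Prefix test mirroring the static helper: o1/o3/gpt-5 families accept reasoning options."""
    return model.startswith(('o1', 'o3', 'gpt-5'))

for name in ('gpt-5-mini', 'o3-mini', 'o1-preview', 'gpt-4o'):
    print(name, supports_reasoning_features(name))
```

This is why temperature is dropped for those models in the request-building code: the check flips `temperature_value` to `None` before the kwargs are assembled.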


@@ -48,11 +48,7 @@ def get_extraction_language_instruction(group_id: str | None = None) -> str:
     Returns:
         str: Language instruction to append to system messages
     """
-    return (
-        '\n\nAny extracted information should be returned in the same language as it was written in. '
-        'Only output non-English text when the user has written full sentences or phrases in that non-English language. '
-        'Otherwise, output English.'
-    )
+    return '\n\nAny extracted information should be returned in the same language as it was written in.'

 logger = logging.getLogger(__name__)


@@ -45,7 +45,7 @@ else:
 logger = logging.getLogger(__name__)

 DEFAULT_MODEL = 'gemini-2.5-flash'
-DEFAULT_SMALL_MODEL = 'gemini-2.5-flash-lite'
+DEFAULT_SMALL_MODEL = 'gemini-2.5-flash-lite-preview-06-17'

 # Maximum output tokens for different Gemini models
 GEMINI_MODEL_MAX_TOKENS = {
@@ -53,6 +53,7 @@ GEMINI_MODEL_MAX_TOKENS = {
     'gemini-2.5-pro': 65536,
     'gemini-2.5-flash': 65536,
     'gemini-2.5-flash-lite': 64000,
+    'models/gemini-2.5-flash-lite-preview-06-17': 64000,
     # Gemini 2.0 models
     'gemini-2.0-flash': 8192,
     'gemini-2.0-flash-lite': 8192,


@@ -31,8 +31,8 @@ from .errors import RateLimitError, RefusalError
 logger = logging.getLogger(__name__)

-DEFAULT_MODEL = 'gpt-4o-mini'
-DEFAULT_SMALL_MODEL = 'gpt-4o-mini'
+DEFAULT_MODEL = 'gpt-5-mini'
+DEFAULT_SMALL_MODEL = 'gpt-5-nano'
 DEFAULT_REASONING = 'minimal'
 DEFAULT_VERBOSITY = 'low'
@@ -166,17 +166,13 @@ class BaseOpenAIClient(LLMClient):
         except openai.RateLimitError as e:
             raise RateLimitError from e
         except openai.AuthenticationError as e:
-            logger.error(
-                f'OpenAI Authentication Error: {e}. Please verify your API key is correct.'
-            )
+            logger.error(f'OpenAI Authentication Error: {e}. Please verify your API key is correct.')
             raise
         except Exception as e:
             # Provide more context for connection errors
             error_msg = str(e)
             if 'Connection error' in error_msg or 'connection' in error_msg.lower():
-                logger.error(
-                    f'Connection error communicating with OpenAI API. Please check your network connection and API key. Error: {e}'
-                )
+                logger.error(f'Connection error communicating with OpenAI API. Please check your network connection and API key. Error: {e}')
             else:
                 logger.error(f'Error in generating LLM response: {e}')
             raise


@@ -74,9 +74,7 @@ class OpenAIClient(BaseOpenAIClient):
     ):
         """Create a structured completion using OpenAI's beta parse API."""
         # Reasoning models (gpt-5 family) don't support temperature
-        is_reasoning_model = (
-            model.startswith('gpt-5') or model.startswith('o1') or model.startswith('o3')
-        )
+        is_reasoning_model = model.startswith('gpt-5') or model.startswith('o1') or model.startswith('o3')
         response = await self.client.responses.parse(
             model=model,
@@ -102,9 +100,7 @@
     ):
         """Create a regular completion with JSON format."""
         # Reasoning models (gpt-5 family) don't support temperature
-        is_reasoning_model = (
-            model.startswith('gpt-5') or model.startswith('o1') or model.startswith('o3')
-        )
+        is_reasoning_model = model.startswith('gpt-5') or model.startswith('o1') or model.startswith('o3')
         return await self.client.chat.completions.create(
             model=model,


@@ -41,16 +41,6 @@ class DateFilter(BaseModel):
     )
-class PropertyFilter(BaseModel):
-    property_name: str = Field(description='Property name')
-    property_value: str | int | float | None = Field(
-        description='Value you want to match on for the property'
-    )
-    comparison_operator: ComparisonOperator = Field(
-        description='Comparison operator for the property'
-    )
 class SearchFilters(BaseModel):
     node_labels: list[str] | None = Field(
         default=None, description='List of node labels to filter on'
@@ -63,7 +53,6 @@ class SearchFilters(BaseModel):
     created_at: list[list[DateFilter]] | None = Field(default=None)
     expired_at: list[list[DateFilter]] | None = Field(default=None)
     edge_uuids: list[str] | None = Field(default=None)
-    property_filters: list[PropertyFilter] | None = Field(default=None)
 def cypher_to_opensearch_operator(op: ComparisonOperator) -> str:


@@ -17,7 +17,7 @@ limitations under the License.
 import re
 # Maximum length for entity/node summaries
-MAX_SUMMARY_CHARS = 500
+MAX_SUMMARY_CHARS = 250
 def truncate_at_sentence(text: str, max_chars: int) -> str:
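The hunk halves the summary cap that `truncate_at_sentence` enforces. The function body is not shown in the diff, so the following is a hedged sketch of sentence-aware truncation consistent with the new constant, not the project's actual implementation:

```python
MAX_SUMMARY_CHARS = 250  # new cap from the diff (was 500)


def truncate_at_sentence(text: str, max_chars: int) -> str:
    # Assumed behavior: prefer cutting at the last sentence boundary
    # inside the limit; fall back to a hard cut when none exists.
    if len(text) <= max_chars:
        return text
    cut = text.rfind('. ', 0, max_chars)
    if cut == -1:
        return text[:max_chars].rstrip()
    return text[: cut + 1].rstrip()


demo = truncate_at_sentence('First sentence. ' * 30, MAX_SUMMARY_CHARS)
```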


@@ -1,49 +0,0 @@
# Graphiti MCP Server Environment Configuration
MCP_SERVER_HOST=gmakai.online
# Neo4j Database Configuration
# These settings are used to connect to your Neo4j database
NEO4J_URI=bolt://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=kg3Jsdb2
# OpenAI API Configuration
# Required for LLM operations
OPENAI_API_KEY=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
MODEL_NAME=gpt5.1-nano
# Optional: Only needed for non-standard OpenAI endpoints
OPENAI_BASE_URL=https://openrouter.ai/api/v1
# Optional: Group ID for namespacing graph data
# GROUP_ID=my_project
# Concurrency Control
# Controls how many episodes can be processed simultaneously
# Default: 10 (suitable for OpenAI Tier 3, mid-tier Anthropic)
# Adjust based on your LLM provider's rate limits:
# - OpenAI Tier 1 (free): 1-2
# - OpenAI Tier 2: 5-8
# - OpenAI Tier 3: 10-15
# - OpenAI Tier 4: 20-50
# - Anthropic default: 5-8
# - Anthropic high tier: 15-30
# - Ollama (local): 1-5
# See README.md "Concurrency and LLM Provider 429 Rate Limit Errors" for details
SEMAPHORE_LIMIT=10
# Optional: Path configuration for Docker
# PATH=/root/.local/bin:${PATH}
# Optional: Memory settings for Neo4j (used in Docker Compose)
# NEO4J_server_memory_heap_initial__size=512m
# NEO4J_server_memory_heap_max__size=1G
# NEO4J_server_memory_pagecache_size=512m
# Azure OpenAI configuration
# Optional: Only needed for Azure OpenAI endpoints
# AZURE_OPENAI_ENDPOINT=your_azure_openai_endpoint_here
# AZURE_OPENAI_API_VERSION=2025-01-01-preview
# AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o-gpt-4o-mini-deployment
# AZURE_OPENAI_EMBEDDING_API_VERSION=2023-05-15
# AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME=text-embedding-3-large-deployment
# AZURE_OPENAI_USE_MANAGED_IDENTITY=false
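The `SEMAPHORE_LIMIT` knob documented in the deleted file bounds how many episodes are processed at once. A minimal sketch of that pattern with `asyncio.Semaphore`; `process_episode` and `process_all` are stand-ins for the real work, not graphiti's API:

```python
import asyncio
import os

# Same env var and default as the deleted .env file documents.
SEMAPHORE_LIMIT = int(os.environ.get('SEMAPHORE_LIMIT', '10'))


async def process_episode(episode: str) -> str:
    await asyncio.sleep(0)  # placeholder for an LLM call
    return episode.upper()


async def process_all(episodes: list[str]) -> list[str]:
    # At most SEMAPHORE_LIMIT coroutines run process_episode at once.
    semaphore = asyncio.Semaphore(SEMAPHORE_LIMIT)

    async def bounded(ep: str) -> str:
        async with semaphore:
            return await process_episode(ep)

    return await asyncio.gather(*(bounded(e) for e in episodes))
```

Lower values trade throughput for fewer 429 responses from rate-limited providers.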


@@ -8,7 +8,7 @@ server:
 llm:
   provider: "openai" # Options: openai, azure_openai, anthropic, gemini, groq
-  model: "gpt-4o-mini"
+  model: "gpt-5-mini"
   max_tokens: 4096
 providers:


@@ -8,7 +8,7 @@ server:
 llm:
   provider: "openai" # Options: openai, azure_openai, anthropic, gemini, groq
-  model: "gpt-4o-mini"
+  model: "gpt-5-mini"
   max_tokens: 4096
 providers:


@@ -8,7 +8,7 @@ server:
 llm:
   provider: "openai" # Options: openai, azure_openai, anthropic, gemini, groq
-  model: "gpt-4o-mini"
+  model: "gpt-5-mini"
   max_tokens: 4096
 providers:


@@ -12,7 +12,7 @@ server:
 llm:
   provider: "openai" # Options: openai, azure_openai, anthropic, gemini, groq
-  model: "gpt-4o-mini"
+  model: "gpt-5-mini"
   max_tokens: 4096
 providers:


@@ -1,23 +1,23 @@
 services:
   neo4j:
-    image: neo4j:latest
+    image: neo4j:5.26.0
     ports:
       - "7474:7474" # HTTP
       - "7687:7687" # Bolt
     environment:
-      - NEO4J_AUTH=${NEO4J_USER:-neo4j}/${NEO4J_PASSWORD:-kg3Jsdb2}
+      - NEO4J_AUTH=${NEO4J_USER:-neo4j}/${NEO4J_PASSWORD:-demodemo}
+      - NEO4J_server_memory_heap_initial__size=512m
+      - NEO4J_server_memory_heap_max__size=1G
+      - NEO4J_server_memory_pagecache_size=512m
     volumes:
-      - /data/neo4j/data:/data
-      - /data/neo4j/logs:/logs
-      - /data/neo4j/plugins:/plugins
-      - /data/neo4j/config:/config
+      - neo4j_data:/data
+      - neo4j_logs:/logs
     healthcheck:
       test: ["CMD", "wget", "-O", "/dev/null", "http://localhost:7474"]
       interval: 10s
       timeout: 5s
       retries: 5
       start_period: 30s
-    restart: always
   graphiti-mcp:
     image: zepai/knowledge-graph-mcp:standalone
@@ -27,9 +27,9 @@ services:
     build:
       context: ..
       dockerfile: docker/Dockerfile.standalone
-    #env_file:
-    #  - path: ../.env
-    #  required: true
+    env_file:
+      - path: ../.env
+        required: false
     depends_on:
       neo4j:
         condition: service_healthy
@@ -37,18 +37,13 @@
       # Database configuration
       - NEO4J_URI=${NEO4J_URI:-bolt://neo4j:7687}
       - NEO4J_USER=${NEO4J_USER:-neo4j}
-      - NEO4J_PASSWORD=${NEO4J_PASSWORD:-kg3Jsdb2}
+      - NEO4J_PASSWORD=${NEO4J_PASSWORD:-demodemo}
       - NEO4J_DATABASE=${NEO4J_DATABASE:-neo4j}
       # Application configuration
       - GRAPHITI_GROUP_ID=${GRAPHITI_GROUP_ID:-main}
       - SEMAPHORE_LIMIT=${SEMAPHORE_LIMIT:-10}
       - CONFIG_PATH=/app/mcp/config/config.yaml
       - PATH=/root/.local/bin:${PATH}
-      - MCP_SERVER_HOST=gmakai.online
-      - OPENAI_API_KEY=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
-      - MODEL_NAME=gpt5.1-nano
-      - OPENAI_BASE_URL=https://openrouter.ai/api/v1
     volumes:
       - ../config/config-docker-neo4j.yaml:/app/mcp/config/config.yaml:ro
     ports:
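The compose file leans on `${VAR:-default}` expansions, which substitute the default when the variable is unset or empty. A Python process can mirror the same precedence; `resolve_setting` is a hypothetical helper, and the defaults shown are the non-secret ones from the diff:

```python
def resolve_setting(env: dict, name: str, default: str) -> str:
    # Matches ${NAME:-default}: the default wins when NAME is unset
    # or set to the empty string.
    value = env.get(name)
    return value if value else default


uri = resolve_setting({}, 'NEO4J_URI', 'bolt://neo4j:7687')
```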


@@ -147,7 +147,7 @@ class LLMConfig(BaseModel):
     """LLM configuration."""
     provider: str = Field(default='openai', description='LLM provider')
-    model: str = Field(default='gpt-4o-mini', description='Model name')
+    model: str = Field(default='gpt-4.1', description='Model name')
     temperature: float | None = Field(
         default=None, description='Temperature (optional, defaults to None for reasoning models)'
     )


@@ -1,7 +1,7 @@
 [project]
 name = "graphiti-core"
 description = "A temporal graph building library"
-version = "0.24.3"
+version = "0.24.0"
 authors = [
     { name = "Paul Paliychuk", email = "paul@getzep.com" },
     { name = "Preston Rasmussen", email = "preston@getzep.com" },


@@ -455,62 +455,6 @@
       "created_at": "2025-11-06T08:39:46Z",
       "repoId": 840056306,
       "pullRequestNo": 1053
-    },
-    {
-      "name": "supmo668",
-      "id": 28805779,
-      "comment_id": 3550309664,
-      "created_at": "2025-11-19T01:56:25Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1072
-    },
-    {
-      "name": "donbr",
-      "id": 7340008,
-      "comment_id": 3568970102,
-      "created_at": "2025-11-24T05:19:42Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1081
-    },
-    {
-      "name": "apetti1920",
-      "id": 4706645,
-      "comment_id": 3572726648,
-      "created_at": "2025-11-24T21:07:34Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1084
-    },
-    {
-      "name": "ZLBillShaw",
-      "id": 55940186,
-      "comment_id": 3583997833,
-      "created_at": "2025-11-27T02:45:53Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1085
-    },
-    {
-      "name": "ronaldmego",
-      "id": 17481958,
-      "comment_id": 3617267429,
-      "created_at": "2025-12-05T14:59:42Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1094
-    },
-    {
-      "name": "NShumway",
-      "id": 29358113,
-      "comment_id": 3634967978,
-      "created_at": "2025-12-10T01:26:49Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1102
-    },
-    {
-      "name": "husniadil",
-      "id": 10581130,
-      "comment_id": 3650156180,
-      "created_at": "2025-12-14T03:37:59Z",
-      "repoId": 840056306,
-      "pullRequestNo": 1105
-    }
+    }
   ]
 }


@@ -81,7 +81,7 @@ class TestAnthropicClientInitialization:
         config = LLMConfig(api_key='test_api_key')
         client = AnthropicClient(config=config, cache=False)
-        assert client.model == 'claude-haiku-4-5-latest'
+        assert client.model == 'claude-3-7-sonnet-latest'
     @patch.dict(os.environ, {'ANTHROPIC_API_KEY': 'env_api_key'})
     def test_init_without_config(self):
@@ -89,7 +89,7 @@
         client = AnthropicClient(cache=False)
         assert client.config.api_key == 'env_api_key'
-        assert client.model == 'claude-haiku-4-5-latest'
+        assert client.model == 'claude-3-7-sonnet-latest'
     def test_init_with_custom_client(self):
         """Test initialization with a custom AsyncAnthropic client."""


@@ -455,6 +455,7 @@ class TestGeminiClientGenerateResponse:
     ('gemini-2.5-flash', 65536),
     ('gemini-2.5-pro', 65536),
     ('gemini-2.5-flash-lite', 64000),
+    ('models/gemini-2.5-flash-lite-preview-06-17', 64000),
     ('gemini-2.0-flash', 8192),
     ('gemini-1.5-pro', 8192),
     ('gemini-1.5-flash', 8192),


@@ -87,7 +87,7 @@ def test_truncate_at_sentence_strips_trailing_whitespace():
 def test_max_summary_chars_constant():
     """Test that MAX_SUMMARY_CHARS is set to expected value."""
-    assert MAX_SUMMARY_CHARS == 500
+    assert MAX_SUMMARY_CHARS == 250
 def test_truncate_at_sentence_realistic_summary():

uv.lock generated

@@ -1,5 +1,5 @@
 version = 1
-revision = 2
+revision = 3
 requires-python = ">=3.10, <4"
 resolution-markers = [
     "python_full_version >= '3.14'",
@@ -262,6 +262,35 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" },
 ]
[[package]]
name = "azure-core"
version = "1.36.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "requests" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0a/c4/d4ff3bc3ddf155156460bff340bbe9533f99fac54ddea165f35a8619f162/azure_core-1.36.0.tar.gz", hash = "sha256:22e5605e6d0bf1d229726af56d9e92bc37b6e726b141a18be0b4d424131741b7", size = 351139, upload-time = "2025-10-15T00:33:49.083Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b1/3c/b90d5afc2e47c4a45f4bba00f9c3193b0417fad5ad3bb07869f9d12832aa/azure_core-1.36.0-py3-none-any.whl", hash = "sha256:fee9923a3a753e94a259563429f3644aaf05c486d45b1215d098115102d91d3b", size = 213302, upload-time = "2025-10-15T00:33:51.058Z" },
]
[[package]]
name = "azure-identity"
version = "1.25.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "azure-core" },
{ name = "cryptography" },
{ name = "msal" },
{ name = "msal-extensions" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/8d/1a6c41c28a37eab26dc85ab6c86992c700cd3f4a597d9ed174b0e9c69489/azure_identity-1.25.1.tar.gz", hash = "sha256:87ca8328883de6036443e1c37b40e8dc8fb74898240f61071e09d2e369361456", size = 279826, upload-time = "2025-10-06T20:30:02.194Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/83/7b/5652771e24fff12da9dde4c20ecf4682e606b104f26419d139758cc935a6/azure_identity-1.25.1-py3-none-any.whl", hash = "sha256:e9edd720af03dff020223cd269fa3a61e8f345ea75443858273bcb44844ab651", size = 191317, upload-time = "2025-10-06T20:30:04.251Z" },
]
 [[package]]
 name = "babel"
 version = "2.17.0"
@@ -520,6 +549,71 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/e6/75/49e5bfe642f71f272236b5b2d2691cf915a7283cc0ceda56357b61daa538/comm-0.2.2-py3-none-any.whl", hash = "sha256:e6fb86cb70ff661ee8c9c14e7d36d6de3b4066f1441be4063df9c5009f0a64d3", size = 7180, upload-time = "2024-03-12T16:53:39.226Z" },
 ]
[[package]]
name = "cryptography"
version = "46.0.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
{ name = "typing-extensions", marker = "python_full_version < '3.11'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9f/33/c00162f49c0e2fe8064a62cb92b93e50c74a72bc370ab92f86112b33ff62/cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1", size = 749258, upload-time = "2025-10-15T23:18:31.74Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1d/42/9c391dd801d6cf0d561b5890549d4b27bafcc53b39c31a817e69d87c625b/cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a", size = 7225004, upload-time = "2025-10-15T23:16:52.239Z" },
{ url = "https://files.pythonhosted.org/packages/1c/67/38769ca6b65f07461eb200e85fc1639b438bdc667be02cf7f2cd6a64601c/cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc", size = 4296667, upload-time = "2025-10-15T23:16:54.369Z" },
{ url = "https://files.pythonhosted.org/packages/5c/49/498c86566a1d80e978b42f0d702795f69887005548c041636df6ae1ca64c/cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d", size = 4450807, upload-time = "2025-10-15T23:16:56.414Z" },
{ url = "https://files.pythonhosted.org/packages/4b/0a/863a3604112174c8624a2ac3c038662d9e59970c7f926acdcfaed8d61142/cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb", size = 4299615, upload-time = "2025-10-15T23:16:58.442Z" },
{ url = "https://files.pythonhosted.org/packages/64/02/b73a533f6b64a69f3cd3872acb6ebc12aef924d8d103133bb3ea750dc703/cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849", size = 4016800, upload-time = "2025-10-15T23:17:00.378Z" },
{ url = "https://files.pythonhosted.org/packages/25/d5/16e41afbfa450cde85a3b7ec599bebefaef16b5c6ba4ec49a3532336ed72/cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8", size = 4984707, upload-time = "2025-10-15T23:17:01.98Z" },
{ url = "https://files.pythonhosted.org/packages/c9/56/e7e69b427c3878352c2fb9b450bd0e19ed552753491d39d7d0a2f5226d41/cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec", size = 4482541, upload-time = "2025-10-15T23:17:04.078Z" },
{ url = "https://files.pythonhosted.org/packages/78/f6/50736d40d97e8483172f1bb6e698895b92a223dba513b0ca6f06b2365339/cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91", size = 4299464, upload-time = "2025-10-15T23:17:05.483Z" },
{ url = "https://files.pythonhosted.org/packages/00/de/d8e26b1a855f19d9994a19c702fa2e93b0456beccbcfe437eda00e0701f2/cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e", size = 4950838, upload-time = "2025-10-15T23:17:07.425Z" },
{ url = "https://files.pythonhosted.org/packages/8f/29/798fc4ec461a1c9e9f735f2fc58741b0daae30688f41b2497dcbc9ed1355/cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926", size = 4481596, upload-time = "2025-10-15T23:17:09.343Z" },
{ url = "https://files.pythonhosted.org/packages/15/8d/03cd48b20a573adfff7652b76271078e3045b9f49387920e7f1f631d125e/cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71", size = 4426782, upload-time = "2025-10-15T23:17:11.22Z" },
{ url = "https://files.pythonhosted.org/packages/fa/b1/ebacbfe53317d55cf33165bda24c86523497a6881f339f9aae5c2e13e57b/cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac", size = 4698381, upload-time = "2025-10-15T23:17:12.829Z" },
{ url = "https://files.pythonhosted.org/packages/96/92/8a6a9525893325fc057a01f654d7efc2c64b9de90413adcf605a85744ff4/cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018", size = 3055988, upload-time = "2025-10-15T23:17:14.65Z" },
{ url = "https://files.pythonhosted.org/packages/7e/bf/80fbf45253ea585a1e492a6a17efcb93467701fa79e71550a430c5e60df0/cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb", size = 3514451, upload-time = "2025-10-15T23:17:16.142Z" },
{ url = "https://files.pythonhosted.org/packages/2e/af/9b302da4c87b0beb9db4e756386a7c6c5b8003cd0e742277888d352ae91d/cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c", size = 2928007, upload-time = "2025-10-15T23:17:18.04Z" },
{ url = "https://files.pythonhosted.org/packages/f5/e2/a510aa736755bffa9d2f75029c229111a1d02f8ecd5de03078f4c18d91a3/cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217", size = 7158012, upload-time = "2025-10-15T23:17:19.982Z" },
{ url = "https://files.pythonhosted.org/packages/73/dc/9aa866fbdbb95b02e7f9d086f1fccfeebf8953509b87e3f28fff927ff8a0/cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5", size = 4288728, upload-time = "2025-10-15T23:17:21.527Z" },
{ url = "https://files.pythonhosted.org/packages/c5/fd/bc1daf8230eaa075184cbbf5f8cd00ba9db4fd32d63fb83da4671b72ed8a/cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715", size = 4435078, upload-time = "2025-10-15T23:17:23.042Z" },
{ url = "https://files.pythonhosted.org/packages/82/98/d3bd5407ce4c60017f8ff9e63ffee4200ab3e23fe05b765cab805a7db008/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54", size = 4293460, upload-time = "2025-10-15T23:17:24.885Z" },
{ url = "https://files.pythonhosted.org/packages/26/e9/e23e7900983c2b8af7a08098db406cf989d7f09caea7897e347598d4cd5b/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459", size = 3995237, upload-time = "2025-10-15T23:17:26.449Z" },
{ url = "https://files.pythonhosted.org/packages/91/15/af68c509d4a138cfe299d0d7ddb14afba15233223ebd933b4bbdbc7155d3/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422", size = 4967344, upload-time = "2025-10-15T23:17:28.06Z" },
{ url = "https://files.pythonhosted.org/packages/ca/e3/8643d077c53868b681af077edf6b3cb58288b5423610f21c62aadcbe99f4/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7", size = 4466564, upload-time = "2025-10-15T23:17:29.665Z" },
{ url = "https://files.pythonhosted.org/packages/0e/43/c1e8726fa59c236ff477ff2b5dc071e54b21e5a1e51aa2cee1676f1c986f/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044", size = 4292415, upload-time = "2025-10-15T23:17:31.686Z" },
{ url = "https://files.pythonhosted.org/packages/42/f9/2f8fefdb1aee8a8e3256a0568cffc4e6d517b256a2fe97a029b3f1b9fe7e/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665", size = 4931457, upload-time = "2025-10-15T23:17:33.478Z" },
{ url = "https://files.pythonhosted.org/packages/79/30/9b54127a9a778ccd6d27c3da7563e9f2d341826075ceab89ae3b41bf5be2/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3", size = 4466074, upload-time = "2025-10-15T23:17:35.158Z" },
{ url = "https://files.pythonhosted.org/packages/ac/68/b4f4a10928e26c941b1b6a179143af9f4d27d88fe84a6a3c53592d2e76bf/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20", size = 4420569, upload-time = "2025-10-15T23:17:37.188Z" },
{ url = "https://files.pythonhosted.org/packages/a3/49/3746dab4c0d1979888f125226357d3262a6dd40e114ac29e3d2abdf1ec55/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de", size = 4681941, upload-time = "2025-10-15T23:17:39.236Z" },
{ url = "https://files.pythonhosted.org/packages/fd/30/27654c1dbaf7e4a3531fa1fc77986d04aefa4d6d78259a62c9dc13d7ad36/cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914", size = 3022339, upload-time = "2025-10-15T23:17:40.888Z" },
{ url = "https://files.pythonhosted.org/packages/f6/30/640f34ccd4d2a1bc88367b54b926b781b5a018d65f404d409aba76a84b1c/cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db", size = 3494315, upload-time = "2025-10-15T23:17:42.769Z" },
{ url = "https://files.pythonhosted.org/packages/ba/8b/88cc7e3bd0a8e7b861f26981f7b820e1f46aa9d26cc482d0feba0ecb4919/cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21", size = 2919331, upload-time = "2025-10-15T23:17:44.468Z" },
{ url = "https://files.pythonhosted.org/packages/fd/23/45fe7f376a7df8daf6da3556603b36f53475a99ce4faacb6ba2cf3d82021/cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936", size = 7218248, upload-time = "2025-10-15T23:17:46.294Z" },
{ url = "https://files.pythonhosted.org/packages/27/32/b68d27471372737054cbd34c84981f9edbc24fe67ca225d389799614e27f/cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683", size = 4294089, upload-time = "2025-10-15T23:17:48.269Z" },
{ url = "https://files.pythonhosted.org/packages/26/42/fa8389d4478368743e24e61eea78846a0006caffaf72ea24a15159215a14/cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d", size = 4440029, upload-time = "2025-10-15T23:17:49.837Z" },
{ url = "https://files.pythonhosted.org/packages/5f/eb/f483db0ec5ac040824f269e93dd2bd8a21ecd1027e77ad7bdf6914f2fd80/cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0", size = 4297222, upload-time = "2025-10-15T23:17:51.357Z" },
{ url = "https://files.pythonhosted.org/packages/fd/cf/da9502c4e1912cb1da3807ea3618a6829bee8207456fbbeebc361ec38ba3/cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc", size = 4012280, upload-time = "2025-10-15T23:17:52.964Z" },
{ url = "https://files.pythonhosted.org/packages/6b/8f/9adb86b93330e0df8b3dcf03eae67c33ba89958fc2e03862ef1ac2b42465/cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3", size = 4978958, upload-time = "2025-10-15T23:17:54.965Z" },
{ url = "https://files.pythonhosted.org/packages/d1/a0/5fa77988289c34bdb9f913f5606ecc9ada1adb5ae870bd0d1054a7021cc4/cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971", size = 4473714, upload-time = "2025-10-15T23:17:56.754Z" },
{ url = "https://files.pythonhosted.org/packages/14/e5/fc82d72a58d41c393697aa18c9abe5ae1214ff6f2a5c18ac470f92777895/cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac", size = 4296970, upload-time = "2025-10-15T23:17:58.588Z" },
{ url = "https://files.pythonhosted.org/packages/78/06/5663ed35438d0b09056973994f1aec467492b33bd31da36e468b01ec1097/cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04", size = 4940236, upload-time = "2025-10-15T23:18:00.897Z" },
{ url = "https://files.pythonhosted.org/packages/fc/59/873633f3f2dcd8a053b8dd1d38f783043b5fce589c0f6988bf55ef57e43e/cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506", size = 4472642, upload-time = "2025-10-15T23:18:02.749Z" },
{ url = "https://files.pythonhosted.org/packages/3d/39/8e71f3930e40f6877737d6f69248cf74d4e34b886a3967d32f919cc50d3b/cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963", size = 4423126, upload-time = "2025-10-15T23:18:04.85Z" },
{ url = "https://files.pythonhosted.org/packages/cd/c7/f65027c2810e14c3e7268353b1681932b87e5a48e65505d8cc17c99e36ae/cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4", size = 4686573, upload-time = "2025-10-15T23:18:06.908Z" },
{ url = "https://files.pythonhosted.org/packages/0a/6e/1c8331ddf91ca4730ab3086a0f1be19c65510a33b5a441cb334e7a2d2560/cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df", size = 3036695, upload-time = "2025-10-15T23:18:08.672Z" },
{ url = "https://files.pythonhosted.org/packages/90/45/b0d691df20633eff80955a0fc7695ff9051ffce8b69741444bd9ed7bd0db/cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f", size = 3501720, upload-time = "2025-10-15T23:18:10.632Z" },
{ url = "https://files.pythonhosted.org/packages/e8/cb/2da4cc83f5edb9c3257d09e1e7ab7b23f049c7962cae8d842bbef0a9cec9/cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372", size = 2918740, upload-time = "2025-10-15T23:18:12.277Z" },
{ url = "https://files.pythonhosted.org/packages/d9/cd/1a8633802d766a0fa46f382a77e096d7e209e0817892929655fe0586ae32/cryptography-46.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:a23582810fedb8c0bc47524558fb6c56aac3fc252cb306072fd2815da2a47c32", size = 3689163, upload-time = "2025-10-15T23:18:13.821Z" },
{ url = "https://files.pythonhosted.org/packages/4c/59/6b26512964ace6480c3e54681a9859c974172fb141c38df11eadd8416947/cryptography-46.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:e7aec276d68421f9574040c26e2a7c3771060bc0cff408bae1dcb19d3ab1e63c", size = 3429474, upload-time = "2025-10-15T23:18:15.477Z" },
{ url = "https://files.pythonhosted.org/packages/06/8a/e60e46adab4362a682cf142c7dcb5bf79b782ab2199b0dcb81f55970807f/cryptography-46.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:7ce938a99998ed3c8aa7e7272dca1a610401ede816d36d0693907d863b10d9ea", size = 3698132, upload-time = "2025-10-15T23:18:17.056Z" },
{ url = "https://files.pythonhosted.org/packages/da/38/f59940ec4ee91e93d3311f7532671a5cef5570eb04a144bf203b58552d11/cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:191bb60a7be5e6f54e30ba16fdfae78ad3a342a0599eb4193ba88e3f3d6e185b", size = 4243992, upload-time = "2025-10-15T23:18:18.695Z" },
{ url = "https://files.pythonhosted.org/packages/b0/0c/35b3d92ddebfdfda76bb485738306545817253d0a3ded0bfe80ef8e67aa5/cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c70cc23f12726be8f8bc72e41d5065d77e4515efae3690326764ea1b07845cfb", size = 4409944, upload-time = "2025-10-15T23:18:20.597Z" },
{ url = "https://files.pythonhosted.org/packages/99/55/181022996c4063fc0e7666a47049a1ca705abb9c8a13830f074edb347495/cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:9394673a9f4de09e28b5356e7fff97d778f8abad85c9d5ac4a4b7e25a0de7717", size = 4242957, upload-time = "2025-10-15T23:18:22.18Z" },
{ url = "https://files.pythonhosted.org/packages/ba/af/72cd6ef29f9c5f731251acadaeb821559fe25f10852f44a63374c9ca08c1/cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:94cd0549accc38d1494e1f8de71eca837d0509d0d44bf11d158524b0e12cebf9", size = 4409447, upload-time = "2025-10-15T23:18:24.209Z" },
{ url = "https://files.pythonhosted.org/packages/0d/c3/e90f4a4feae6410f914f8ebac129b9ae7a8c92eb60a638012dde42030a9d/cryptography-46.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6b5063083824e5509fdba180721d55909ffacccc8adbec85268b48439423d78c", size = 3438528, upload-time = "2025-10-15T23:18:26.227Z" },
]

[[package]]
name = "debugpy"
version = "1.8.14"
@@ -808,7 +902,7 @@ wheels = [
[[package]]
name = "graphiti-core"
version = "0.23.1"
source = { editable = "." }
dependencies = [
    { name = "diskcache" },
@@ -826,8 +920,12 @@ dependencies = [
anthropic = [
    { name = "anthropic" },
]
azure = [
    { name = "azure-identity" },
]
dev = [
    { name = "anthropic" },
    { name = "azure-identity" },
    { name = "boto3" },
    { name = "diskcache-stubs" },
    { name = "falkordb" },
@@ -888,6 +986,8 @@ voyageai = [
requires-dist = [
    { name = "anthropic", marker = "extra == 'anthropic'", specifier = ">=0.49.0" },
    { name = "anthropic", marker = "extra == 'dev'", specifier = ">=0.49.0" },
    { name = "azure-identity", marker = "extra == 'azure'", specifier = ">=1.25.1" },
    { name = "azure-identity", marker = "extra == 'dev'", specifier = ">=1.25.1" },
    { name = "boto3", marker = "extra == 'dev'", specifier = ">=1.39.16" },
    { name = "boto3", marker = "extra == 'neo4j-opensearch'", specifier = ">=1.39.16" },
    { name = "boto3", marker = "extra == 'neptune'", specifier = ">=1.39.16" },
@@ -933,7 +1033,7 @@ requires-dist = [
    { name = "voyageai", marker = "extra == 'dev'", specifier = ">=0.2.3" },
    { name = "voyageai", marker = "extra == 'voyageai'", specifier = ">=0.2.3" },
]
provides-extras = ["anthropic", "groq", "google-genai", "kuzu", "falkordb", "voyageai", "neo4j-opensearch", "sentence-transformers", "neptune", "tracing", "azure", "dev"]

[[package]]
name = "groq"
@@ -1711,6 +1811,32 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/43/e3/7d92a15f894aa0c9c4b49b8ee9ac9850d6e63b03c9c32c0367a13ae62209/mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c", size = 536198, upload-time = "2023-03-07T16:47:09.197Z" },
]

[[package]]
name = "msal"
version = "1.34.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
{ name = "pyjwt", extra = ["crypto"] },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/cf/0e/c857c46d653e104019a84f22d4494f2119b4fe9f896c92b4b864b3b045cc/msal-1.34.0.tar.gz", hash = "sha256:76ba83b716ea5a6d75b0279c0ac353a0e05b820ca1f6682c0eb7f45190c43c2f", size = 153961, upload-time = "2025-09-22T23:05:48.989Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c2/dc/18d48843499e278538890dc709e9ee3dea8375f8be8e82682851df1b48b5/msal-1.34.0-py3-none-any.whl", hash = "sha256:f669b1644e4950115da7a176441b0e13ec2975c29528d8b9e81316023676d6e1", size = 116987, upload-time = "2025-09-22T23:05:47.294Z" },
]

[[package]]
name = "msal-extensions"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "msal" },
]
sdist = { url = "https://files.pythonhosted.org/packages/01/99/5d239b6156eddf761a636bded1118414d161bd6b7b37a9335549ed159396/msal_extensions-1.3.1.tar.gz", hash = "sha256:c5b0fd10f65ef62b5f1d62f4251d51cbcaf003fcedae8c91b040a488614be1a4", size = 23315, upload-time = "2025-03-14T23:51:03.902Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/5e/75/bd9b7bb966668920f06b200e84454c8f3566b102183bc55c5473d96cb2b9/msal_extensions-1.3.1-py3-none-any.whl", hash = "sha256:96d3de4d034504e969ac5e85bae8106c8373b5c6568e4c8fa7af2eca9dbe6bca", size = 20583, upload-time = "2025-03-14T23:51:03.016Z" },
]

[[package]]
name = "multidict"
version = "6.6.3"
@@ -2814,6 +2940,11 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/79/84/0fdf9b18ba31d69877bd39c9cd6052b47f3761e9910c15de788e519f079f/PyJWT-2.9.0-py3-none-any.whl", hash = "sha256:3b02fb0f44517787776cf48f2ae25d8e14f300e6d7545a4315cee571a415e850", size = 22344, upload-time = "2024-08-01T15:01:06.481Z" },
]

[package.optional-dependencies]
crypto = [
{ name = "cryptography" },
]

[[package]]
name = "pyright"
version = "1.1.404"