graphiti/graphiti_core/utils/maintenance
Daniel Chalef 196eb2f077
Remove JSON indentation from prompts to reduce token usage (#985)
Changes the `to_prompt_json()` helper to default to minified JSON (no indentation) instead of 2-space indented JSON. This reduces token consumption in LLM prompts while preserving all necessary information.

- Changed the default `indent` parameter from `2` to `None` in `prompt_helpers.py` (see the sketch after this list)
- Updated all prompt modules to remove explicit `indent=2` arguments
- Minor code formatting fixes in LLM clients
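A minimal sketch of the change, assuming `to_prompt_json()` is a thin wrapper around `json.dumps`; the actual body in `prompt_helpers.py` may differ:

```python
import json
from typing import Any


def to_prompt_json(data: Any, indent: int | None = None) -> str:
    """Serialize data for embedding in an LLM prompt.

    With indent=None (the new default), json.dumps emits compact JSON,
    which consumes fewer tokens than the previous 2-space indented output.
    """
    return json.dumps(data, indent=indent)


# Caller-side effect in the prompt modules: the explicit indent is dropped.
# Before: to_prompt_json(context, indent=2)   -> multi-line, indented JSON
# After:  to_prompt_json(context)             -> single-line, compact JSON
```

Note that `json.dumps` with `indent=None` still inserts a space after each comma and colon; fully minified output would also pass `separators=(',', ':')`, though whether the real helper does so is not shown here.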

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-06 16:08:43 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | Gemini support (#324) | 2025-04-06 09:27:04 -07:00 |
| community_operations.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| dedup_helpers.py | Refactor batch deduplication logic to enhance node resolution and track duplicate pairs (#929) (#936) | 2025-09-26 08:40:18 -07:00 |
| edge_operations.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| graph_data_operations.py | OpenSearch updates (#906) | 2025-09-14 01:43:37 -04:00 |
| node_operations.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |
| temporal_operations.py | Add OpenTelemetry distributed tracing support (#982) | 2025-10-05 12:26:14 -07:00 |