graphiti/graphiti_core/prompts
Latest commit 196eb2f077 by Daniel Chalef:
Remove JSON indentation from prompts to reduce token usage (#985)
Changes the `to_prompt_json()` helper to default to minified JSON (no indentation) instead of 2-space indentation. This reduces token consumption in LLM prompts while preserving all necessary information.

- Changed default `indent` parameter from `2` to `None` in `prompt_helpers.py`
- Updated all prompt modules to remove explicit `indent=2` arguments
- Minor code formatting fixes in LLM clients

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-06 16:08:43 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | chore: Fix packaging (#38) | 2024-08-25 10:07:50 -07:00 |
| `dedupe_edges.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `dedupe_nodes.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `eval.py` | Remove ensure_ascii configuration parameter (#969) | 2025-10-02 15:10:57 -07:00 |
| `extract_edge_dates.py` | prompt update (#378) | 2025-04-18 00:09:12 -04:00 |
| `extract_edges.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `extract_nodes.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `invalidate_edges.py` | Node dedupe efficiency (#490) | 2025-05-15 13:56:33 -04:00 |
| `lib.py` | Gemini support (#324) | 2025-04-06 09:27:04 -07:00 |
| `models.py` | chore: update version to 0.9.3 and restructure dependencies (#338) | 2025-04-08 20:47:38 -07:00 |
| `prompt_helpers.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |
| `snippets.py` | Refactor summary prompts to use character limit and prevent meta-commentary (#979) | 2025-10-04 15:44:00 -07:00 |
| `summarize_nodes.py` | Remove JSON indentation from prompts to reduce token usage (#985) | 2025-10-06 16:08:43 -07:00 |