Change the `to_prompt_json()` helper to default to minified JSON (no indentation) instead of 2-space indentation. This reduces token consumption in LLM prompts while preserving all necessary information.

- Changed the default `indent` parameter from `2` to `None` in `prompt_helpers.py`
- Updated all prompt modules to remove explicit `indent=2` arguments
- Minor code formatting fixes in LLM clients

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
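For context, a minimal sketch of how such a helper typically behaves, assuming it wraps the standard-library `json.dumps` (the actual implementation in `prompt_helpers.py` may differ):

```python
import json
from typing import Any


def to_prompt_json(data: Any, indent: int | None = None) -> str:
    """Serialize data for embedding in an LLM prompt.

    Sketch only: assumes the real helper in prompt_helpers.py wraps
    json.dumps. With the new default of indent=None, json.dumps emits
    a single compact line instead of a pretty-printed block, so the
    serialized payload spends far fewer tokens on whitespace.
    """
    return json.dumps(data, indent=indent)


data = {"name": "Alice", "labels": ["Person", "Employee"]}

# New default: one compact line, no newlines or indentation.
print(to_prompt_json(data))
# {"name": "Alice", "labels": ["Person", "Employee"]}

# Old behavior, still available by passing indent=2 explicitly.
print(to_prompt_json(data, indent=2))
# {
#   "name": "Alice",
#   "labels": [
#     "Person",
#     "Employee"
#   ]
# }
```

Callers that want the old pretty-printed output can keep passing `indent=2`; only the default changes.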
Files in the prompts package:

- `__init__.py`
- `dedupe_edges.py`
- `dedupe_nodes.py`
- `eval.py`
- `extract_edge_dates.py`
- `extract_edges.py`
- `extract_nodes.py`
- `invalidate_edges.py`
- `lib.py`
- `models.py`
- `prompt_helpers.py`
- `snippets.py`
- `summarize_nodes.py`