Change the `to_prompt_json()` helper to default to minified JSON (no indentation) instead of 2-space indentation. This reduces token consumption in LLM prompts while preserving all of the information in the payload.

- Changed the default `indent` parameter from `2` to `None` in `prompt_helpers.py`
- Updated all prompt modules to remove their explicit `indent=2` arguments
- Minor code formatting fixes in the LLM clients

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <noreply@anthropic.com>
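A minimal sketch of what the updated helper in `prompt_helpers.py` might look like after this change. Only the `indent` default moving from `2` to `None` is from the commit; the exact signature, the `separators` handling, and `ensure_ascii=False` are assumptions for illustration:

```python
import json
from typing import Any


def to_prompt_json(data: Any, indent: int | None = None) -> str:
    """Serialize `data` as JSON for embedding in an LLM prompt.

    `indent=None` (the new default) yields minified output; pass an
    integer such as `indent=2` to restore pretty-printing for debugging.
    """
    if indent is None:
        # Fully minified: no newlines and no spaces after separators,
        # which is where the prompt-token savings come from.
        return json.dumps(data, separators=(",", ":"), ensure_ascii=False)
    return json.dumps(data, indent=indent, ensure_ascii=False)
```

With this default, `to_prompt_json({"a": 1})` returns `{"a":1}`, and call sites that previously passed `indent=2` can simply drop the argument.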
Files in the LLM clients directory: `__init__.py`, `anthropic_client.py`, `azure_openai_client.py`, `client.py`, `config.py`, `errors.py`, `gemini_client.py`, `groq_client.py`, `openai_base_client.py`, `openai_client.py`, `openai_generic_client.py`, `utils.py`