
# Structured Output Backends

Configure structured output frameworks for reliable data extraction in Cognee

Structured output backends ensure reliable data extraction from LLM responses. Cognee supports two frameworks that convert LLM text into structured Pydantic models for knowledge graph extraction and other tasks.

**New to configuration?** See the Setup Configuration Overview for the complete workflow: install extras → create `.env` → choose providers → handle pruning.

## Supported Frameworks

Cognee supports two structured output approaches:

- **LiteLLM + Instructor** — provider-agnostic client with Pydantic coercion (default)
- **BAML** — DSL-based framework with a type registry and guardrails

Both frameworks produce the same Pydantic-validated outputs, so your application code remains unchanged regardless of which backend you choose.
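To make "Pydantic-validated outputs" concrete, here is a minimal sketch of a response model that either backend could populate. The `Entity` and `ExtractionResult` models are hypothetical illustrations, not Cognee's actual extraction schema:

```python
from pydantic import BaseModel


class Entity(BaseModel):
    """Hypothetical node model; Cognee's real schemas differ."""
    name: str
    type: str


class ExtractionResult(BaseModel):
    """Hypothetical container the backend would fill from LLM text."""
    entities: list[Entity]


# Whichever backend runs, downstream code receives a validated
# instance with typed fields rather than raw LLM text.
result = ExtractionResult(entities=[Entity(name="Cognee", type="Library")])
print(result.entities[0].name)  # -> Cognee
```

Because both backends coerce output into the same model class, swapping frameworks requires no change to the code that consumes `result`.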

## How It Works

Cognee uses a unified interface that abstracts the underlying framework:

```python
from cognee.infrastructure.llm.LLMGateway import LLMGateway

await LLMGateway.acreate_structured_output(text, system_prompt, response_model)
```

The `STRUCTURED_OUTPUT_FRAMEWORK` environment variable determines which backend processes your requests, but the API remains identical.
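This env-driven dispatch can be sketched as follows. The adapter functions below are stand-ins for illustration only; Cognee's real adapters live behind `LLMGateway` and are not shown here:

```python
import os


# Hypothetical stand-ins for the two backends.
def _instructor_backend(text, system_prompt, response_model):
    return f"instructor:{response_model}"


def _baml_backend(text, system_prompt, response_model):
    return f"baml:{response_model}"


_BACKENDS = {
    "instructor": _instructor_backend,  # default
    "baml": _baml_backend,
}


def create_structured_output(text, system_prompt, response_model):
    # The env var selects the backend; the call site never changes.
    framework = os.getenv("STRUCTURED_OUTPUT_FRAMEWORK", "instructor")
    return _BACKENDS[framework](text, system_prompt, response_model)


os.environ["STRUCTURED_OUTPUT_FRAMEWORK"] = "baml"
print(create_structured_output("some text", "extract entities", "Entity"))
# -> baml:Entity
```

The design keeps the framework choice out of application code entirely: only configuration changes when you switch backends.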

## Configuration

```dotenv
# Default: LiteLLM + Instructor
STRUCTURED_OUTPUT_FRAMEWORK=instructor
```

```dotenv
# Alternative: BAML
STRUCTURED_OUTPUT_FRAMEWORK=baml
```

## Important Notes

- **Unified interface:** your application code uses the same `acreate_structured_output()` call regardless of framework
- **Provider flexibility:** both frameworks support the same LLM providers
- **Output consistency:** both produce identical Pydantic-validated results
- **Performance:** framework choice does not significantly impact performance
