
# Installation


Set up your environment and install Cognee to start building AI memory.

Python **3.9–3.12** is required to run Cognee.

## Prerequisites

* We recommend creating a `.env` file in your project root. Cognee supports many configuration options, and a `.env` file keeps them organized.

You have two main options for configuring LLM and embedding providers:

**Option 1: OpenAI (Simplest)**

* Single API key handles both LLM and embeddings
* Uses gpt-4o-mini for LLM and text-embedding-3-small for embeddings by default
* Works out of the box with minimal configuration

**Option 2: Other Providers**

* Configure both LLM and embedding providers separately
* Supports Gemini, Anthropic, Ollama, and more
* Requires setting both `LLM_*` and `EMBEDDING_*` variables

<Info>
  By default, Cognee uses OpenAI for both LLMs and embeddings. If you change the LLM provider but don't configure embeddings, it will still default to OpenAI.
</Info>
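To illustrate why a `.env` file keeps configuration manageable: it is just `KEY=value` lines, one setting per line. The reader below is a minimal stdlib-only sketch for illustration — Cognee loads your `.env` for you, so you never need to write this yourself:

```python
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    """Parse KEY=value lines from a .env file, skipping blanks and comments."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

# Example: write a two-line .env and read it back
Path(".env.example").write_text('LLM_API_KEY="sk-test"\n# a comment\n')
print(load_env(".env.example"))  # {'LLM_API_KEY': 'sk-test'}
```

All the `LLM_*` and `EMBEDDING_*` variables shown in the Setup section below follow this same format.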
* We recommend using [uv](https://github.com/astral-sh/uv) for virtual environment management. Run the following commands to create and activate a virtual environment:

```bash
uv venv && source .venv/bin/activate
```

* A running PostgreSQL server is required only if you plan to use PostgreSQL as your relational database (install the `postgres` extra)
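If you do opt for PostgreSQL, the pattern mirrors the provider setup below: install the matching extra and point Cognee at your server via `.env`. A sketch, assuming a local PostgreSQL instance — the `DB_*` variable names here are an assumption, so verify them against the configuration guides before relying on them:

```shell
uv pip install "cognee[postgres]"
```

```bash
# .env — assumed variable names and local defaults; verify against the docs
DB_PROVIDER="postgres"
DB_HOST="127.0.0.1"
DB_PORT="5432"
DB_NAME="cognee_db"
DB_USERNAME="cognee"
DB_PASSWORD="cognee"
```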

## Setup

**Option 1: OpenAI**

**Environment:** Add your OpenAI API key to your `.env` file:

```bash
LLM_API_KEY="your_openai_api_key"
```

**Installation:** Install the base Cognee package:

```bash
uv pip install cognee
```

**What this gives you**: Cognee installed with default local databases (SQLite, LanceDB, Kuzu) — no external servers required.

<Info>
  This single API key handles both LLM and embeddings. We use gpt-4o-mini for the LLM model and text-embedding-3-small for embeddings by default.
</Info>
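After installing, it can help to confirm the key is actually visible to your process before the first run. A minimal stdlib sketch — the `require_key` helper is ours for illustration, not part of Cognee's API:

```python
import os

def require_key(name: str = "LLM_API_KEY") -> str:
    """Fail fast with a clear message if the provider key is missing."""
    value = os.environ.get(name, "")
    if not value:
        raise RuntimeError(f"{name} is not set - add it to your .env file")
    return value[:4] + "***"  # masked preview; never log full keys

os.environ.setdefault("LLM_API_KEY", "sk-demo-key")
print(require_key())  # prints a masked prefix such as "sk-d***"
```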
**Option 2: Other Providers**

**Environment:** Configure both LLM and embedding providers in your `.env` file. Here is an example for Gemini:

```bash
# LLM
LLM_PROVIDER="gemini"
LLM_MODEL="gemini/gemini-flash-latest"
LLM_API_KEY="your_gemini_api_key"

# Embeddings
EMBEDDING_PROVIDER="gemini"
EMBEDDING_MODEL="gemini/text-embedding-004"
EMBEDDING_API_KEY="your_gemini_api_key"
```

<Info>
  Make sure to configure both LLM and embedding settings. If you only set one, the other will default to OpenAI.
</Info>

**Installation:** Install Cognee with a provider-specific extra (`gemini`, `anthropic`, `ollama`, `mistral`, `huggingface`, or `groq`), for example:

```bash
uv pip install "cognee[gemini]"
```

**What this gives you**: Cognee installed with your chosen providers and default local databases.

For detailed configuration options, see our [LLM](/setup-configuration/llm-providers) and [Embeddings](/setup-configuration/embedding-providers) guides.

## Next Steps

* **Quickstart Tutorial**: Get started with Cognee by running your first knowledge graph example.
* **Core Concepts**: Dive deeper into Cognee's powerful features and capabilities.
