Remove documentation changes as requested by reviewers

- Reverted README.md to original state
- Reverted cognee-starter-kit/README.md to original state
- Documentation will be updated separately by maintainers
xdurawa 2025-09-03 01:34:21 -04:00
parent 06a3458982
commit c91d1ff0ae
2 changed files with 0 additions and 48 deletions

README.md

@@ -125,16 +125,6 @@ os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
```
You can also set the variables by creating a `.env` file, using our <a href="https://github.com/topoteretes/cognee/blob/main/.env.template">template.</a>
**Supported LLM Providers:** OpenAI (default), Anthropic, Gemini, Ollama, AWS Bedrock
**For AWS Bedrock:** Set `LLM_PROVIDER="bedrock"` and use one of three authentication methods:
- API Key: `LLM_API_KEY="your_bedrock_api_key"`
- AWS Credentials: `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` (+ `AWS_SESSION_TOKEN` if needed)
- AWS Profile: `AWS_PROFILE_NAME="your_profile"`
Use an [inference profile](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html#API_runtime_InvokeModel_Example_5:~:text=Use%20an%20inference%20profile%20in%20model%20invocation) for the model IDs. This usually means appending `us.*` (or other region) to the model ID (e.g., `us.anthropic.claude-3-5-sonnet-20241022-v2:0`). See [AWS Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html).
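For instance, the API-key option can be wired up directly in Python before using cognee, mirroring the snippet above. A minimal sketch with placeholder values; `AWS_REGION_NAME` is the region variable assumed from the starter-kit template:
```python
import os

# Minimal sketch: configure cognee for AWS Bedrock via environment variables.
# Placeholder values; the model ID uses a "us." inference profile (see note above).
os.environ["LLM_PROVIDER"] = "bedrock"
os.environ["LLM_MODEL"] = "us.anthropic.claude-3-5-sonnet-20241022-v2:0"
os.environ["LLM_API_KEY"] = "your_bedrock_api_key"  # or AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY, or AWS_PROFILE_NAME
os.environ["AWS_REGION_NAME"] = "us-east-1"         # assumed variable name; adjust to your region
```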
To use a different LLM provider, check out our <a href="https://docs.cognee.ai">documentation</a> for more info.

cognee-starter-kit/README.md

@@ -25,14 +25,6 @@ uv sync
## Setup LLM
Add environment variables to `.env` file.
If you choose the OpenAI provider, add just the model and api_key (see the minimal example after the template below).
**Supported LLM Providers:**
- OpenAI (default)
- Anthropic
- Gemini
- Ollama
- AWS Bedrock
```
LLM_PROVIDER=""
LLM_MODEL=""
@@ -47,36 +39,6 @@ EMBEDDING_API_KEY=""
EMBEDDING_API_VERSION=""
```
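If you stay on the default OpenAI provider, only the LLM entries need values. A minimal sketch (the model name here is just an example):
```
LLM_PROVIDER="openai"
LLM_MODEL="gpt-4o-mini"
LLM_API_KEY="your_openai_api_key"
```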
**For AWS Bedrock, you have three authentication options:**
1. **API Key (Bearer Token):**
```
LLM_PROVIDER="bedrock"
LLM_API_KEY="your_bedrock_api_key"
LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
AWS_REGION_NAME="us-east-1"
```
2. **AWS Credentials:**
```
LLM_PROVIDER="bedrock"
LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
AWS_ACCESS_KEY_ID="your_aws_access_key"
AWS_SECRET_ACCESS_KEY="your_aws_secret_key"
AWS_SESSION_TOKEN="your_session_token" # only if needed
AWS_REGION_NAME="us-east-1"
```
3. **AWS Profile:**
```
LLM_PROVIDER="bedrock"
LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
AWS_PROFILE_NAME="your_aws_profile"
AWS_REGION_NAME="us-east-1"
```
**Note:** For Bedrock models, use an [inference profile](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html#API_runtime_InvokeModel_Example_5:~:text=Use%20an%20inference%20profile%20in%20model%20invocation) for `LLM_MODEL`. This usually means appending `us.*` (or other region) to the model ID (e.g., `us.anthropic.claude-3-5-sonnet-20241022-v2:0`). See [AWS Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html) for available models.
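Once the `.env` is filled in (and the environment below is activated), a first run might look like this hypothetical sketch; it assumes python-dotenv is installed and that the `add`/`cognify`/`search` entry points match your installed cognee version:
```python
import asyncio
from dotenv import load_dotenv

load_dotenv()  # pull LLM_* / AWS_* settings from .env before cognee reads them

import cognee

async def main():
    await cognee.add("Cognee turns documents into an AI memory.")  # ingest a sample text
    await cognee.cognify()                                         # build the knowledge graph
    results = await cognee.search(query_text="What does cognee do?")
    print(results)

asyncio.run(main())
```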
Activate the Python environment:
```
source .venv/bin/activate