diff --git a/README.md b/README.md
index 73c6aa898..e618d5bf9 100644
--- a/README.md
+++ b/README.md
@@ -125,16 +125,6 @@ os.environ["LLM_API_KEY"] = "YOUR OPENAI_API_KEY"
```
You can also set the variables by creating a `.env` file from our template.
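
For example, a minimal `.env` for the default OpenAI provider could look like this (the model ID below is illustrative; use any model your key can access):

```
LLM_PROVIDER="openai"
LLM_MODEL="gpt-4o-mini"
LLM_API_KEY="YOUR_OPENAI_API_KEY"
```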
-
-**Supported LLM Providers:** OpenAI (default), Anthropic, Gemini, Ollama, AWS Bedrock
-
-**For AWS Bedrock:** Set `LLM_PROVIDER="bedrock"` and use one of three authentication methods:
-- API Key: `LLM_API_KEY="your_bedrock_api_key"`
-- AWS Credentials: `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY` (+ `AWS_SESSION_TOKEN` if needed)
-- AWS Profile: `AWS_PROFILE_NAME="your_profile"`
-
-Use an [inference profile](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html#API_runtime_InvokeModel_Example_5:~:text=Use%20an%20inference%20profile%20in%20model%20invocation) for the model IDs. This usually means prefixing the model ID with a region, e.g. `us.` (as in `us.anthropic.claude-3-5-sonnet-20241022-v2:0`). See [AWS Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html).
-
To use a different LLM provider, check out our documentation for more info.
diff --git a/cognee-starter-kit/README.md b/cognee-starter-kit/README.md
index 5a9369b89..c265e278e 100644
--- a/cognee-starter-kit/README.md
+++ b/cognee-starter-kit/README.md
@@ -25,14 +25,6 @@ uv sync
## Setup LLM
Add environment variables to the `.env` file.
If you choose the OpenAI provider, just add the model and API key.
-
-**Supported LLM Providers:**
-- OpenAI (default)
-- Anthropic
-- Gemini
-- Ollama
-- AWS Bedrock
-
```
LLM_PROVIDER=""
LLM_MODEL=""
@@ -47,36 +39,6 @@ EMBEDDING_API_KEY=""
EMBEDDING_API_VERSION=""
```
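
For example, sticking with the default OpenAI provider, a filled-in `.env` might look like this (the model ID is illustrative, and variables that don't apply can stay empty):

```
LLM_PROVIDER="openai"
LLM_MODEL="gpt-4o-mini"
LLM_API_KEY="your_openai_api_key"
EMBEDDING_API_KEY="your_openai_api_key"
```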
-**For AWS Bedrock, you have three authentication options:**
-
-1. **API Key (Bearer Token):**
-```
-LLM_PROVIDER="bedrock"
-LLM_API_KEY="your_bedrock_api_key"
-LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
-AWS_REGION_NAME="us-east-1"
-```
-
-2. **AWS Credentials:**
-```
-LLM_PROVIDER="bedrock"
-LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
-AWS_ACCESS_KEY_ID="your_aws_access_key"
-AWS_SECRET_ACCESS_KEY="your_aws_secret_key"
-AWS_SESSION_TOKEN="your_session_token" # only if needed
-AWS_REGION_NAME="us-east-1"
-```
-
-3. **AWS Profile:**
-```
-LLM_PROVIDER="bedrock"
-LLM_MODEL="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
-AWS_PROFILE_NAME="your_aws_profile"
-AWS_REGION_NAME="us-east-1"
-```
-
-**Note:** For Bedrock models, use an [inference profile](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html#API_runtime_InvokeModel_Example_5:~:text=Use%20an%20inference%20profile%20in%20model%20invocation) for `LLM_MODEL`. This usually means prefixing the model ID with a region, e.g. `us.` (as in `us.anthropic.claude-3-5-sonnet-20241022-v2:0`). See [AWS Bedrock models](https://docs.aws.amazon.com/bedrock/latest/userguide/models-supported.html) for available models.
-
Activate the Python environment:
```
source .venv/bin/activate