# Running cognee with local models
## 🚀 Getting Started with Local Models
You'll need to run a local model on your machine, or use one of the providers that hosts it.

!!! note "We had some success with Mixtral, but 7B models did not work well. We recommend using Mixtral for now."
### Ollama
Set up Ollama by following the instructions on the [Ollama website](https://ollama.com/).
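If you want to try the `mistral:instruct` model referenced below, a minimal setup might look like this (a sketch, assuming a default local Ollama install; any model Ollama supports will work):

```bash
# Download the model weights locally (one-time)
ollama pull mistral:instruct

# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve
```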
Set the environment variable to select the provider:
```bash
LLM_PROVIDER='ollama'
```
Alternatively, you can set the provider in the configuration:
```python
from cognee.infrastructure import infrastructure_config

infrastructure_config.set_config({
    "llm_provider": "ollama"
})
```
You can also set the endpoint and model name:
```bash
CUSTOM_OLLAMA_ENDPOINT="http://localhost:11434/v1"
CUSTOM_OLLAMA_MODEL="mistral:instruct"
```
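As a quick sanity check, independent of cognee, you can query Ollama's OpenAI-compatible endpoint directly. This assumes the server is running locally and the model above has been pulled:

```bash
# Send a minimal chat request to the local Ollama server
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mistral:instruct",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```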
### Anyscale
To use a hosted provider such as Anyscale, set the provider to `custom`:

```bash
LLM_PROVIDER='custom'
```
Alternatively, you can set the provider in the configuration:
```python
from cognee.infrastructure import infrastructure_config

infrastructure_config.set_config({
    "llm_provider": "custom"
})
```
You can also set the endpoint and model name:
```bash
CUSTOM_LLM_MODEL="mistralai/Mixtral-8x7B-Instruct-v0.1"
CUSTOM_ENDPOINT="https://api.endpoints.anyscale.com/v1"
CUSTOM_LLM_API_KEY="your_api_key"
```
You can set the endpoint and model name the same way for any other provider that exposes an API endpoint.
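For example, reusing the same variables for a hypothetical OpenAI-style provider (every value below is a placeholder, not a real endpoint):

```bash
# Placeholder values; substitute your provider's actual model, URL, and key
LLM_PROVIDER='custom'
CUSTOM_LLM_MODEL="your-provider/your-model-name"
CUSTOM_ENDPOINT="https://api.your-provider.com/v1"
CUSTOM_LLM_API_KEY="your_api_key"
```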