Merge branch 'dev' into COG-3146

commit f2166be823

156 changed files with 9,750 additions and 5,808 deletions
@@ -3,7 +3,7 @@
 language: en
 early_access: false
 enable_free_tier: true
 reviews:
   profile: chill
   instructions: >-
     # Code Review Instructions

@@ -118,10 +118,10 @@ reviews:
       - E117
       - D208
     line_length: 100
     dummy_variable_rgx: '^(_.*|junk|extra)$' # Variables starting with '_' or named 'junk' or 'extra' are considered dummy variables
   markdownlint:
     enabled: true
   yamllint:
     enabled: true
 chat:
   auto_reply: true
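The `dummy_variable_rgx` pattern above can be sanity-checked in isolation. A minimal sketch (the standalone check below is for illustration only; ruff applies the pattern internally to unused bindings):

```python
import re

# Pattern copied from the diff above: names starting with '_' or exactly
# 'junk' or 'extra' count as dummy variables; note that 'extras' does NOT match.
DUMMY_VARIABLE_RGX = re.compile(r"^(_.*|junk|extra)$")

for name in ["_", "_unused", "junk", "extra", "extras", "result"]:
    verdict = "dummy" if DUMMY_VARIABLE_RGX.match(name) else "not dummy"
    print(f"{name!r}: {verdict}")
```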
@@ -2,4 +2,8 @@
 # Example:
 # CORS_ALLOWED_ORIGINS="https://yourdomain.com,https://another.com"
 # For local development, you might use:
 # CORS_ALLOWED_ORIGINS="http://localhost:3000"
+LLM_API_KEY="your-openai-api-key"
+LLM_MODEL="openai/gpt-4o-mini"
+LLM_PROVIDER="openai"
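The three `LLM_*` variables added above are read from the environment at runtime. A hedged sketch of how an application might consume them (`load_llm_settings` and its fallback values are illustrative assumptions mirroring the template, not cognee's actual configuration code):

```python
import os

def load_llm_settings() -> dict:
    # Hypothetical reader for the variables added to the env template above;
    # the fallbacks mirror the template values, not cognee's real defaults.
    return {
        "api_key": os.getenv("LLM_API_KEY", ""),
        "model": os.getenv("LLM_MODEL", "openai/gpt-4o-mini"),
        "provider": os.getenv("LLM_PROVIDER", "openai"),
    }

os.environ["LLM_API_KEY"] = "sk-test"  # stand-in value for demonstration
print(load_llm_settings()["provider"])  # → openai
```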
@@ -28,4 +28,4 @@ secret-scan:
   - path: 'docker-compose.yml'
     comment: 'Development docker compose with test credentials (neo4j/pleaseletmein, postgres cognee/cognee)'
   - path: 'deployment/helm/docker-compose-helm.yml'
     comment: 'Helm deployment docker compose with test postgres credentials (cognee/cognee)'
.github/ISSUE_TEMPLATE/bug_report.yml (16 changes, vendored)

@@ -8,7 +8,7 @@ body:
     attributes:
       value: |
         Thanks for taking the time to fill out this bug report! Please provide a clear and detailed description.

   - type: textarea
     id: description
     attributes:
@@ -17,7 +17,7 @@ body:
       placeholder: Describe the bug in detail...
     validations:
       required: true

   - type: textarea
     id: reproduction
     attributes:
@@ -29,7 +29,7 @@ body:
         3. See error...
     validations:
       required: true

   - type: textarea
     id: expected
     attributes:
@@ -38,7 +38,7 @@ body:
       placeholder: Describe what you expected...
     validations:
       required: true

   - type: textarea
     id: actual
     attributes:
@@ -47,7 +47,7 @@ body:
       placeholder: Describe what actually happened...
     validations:
       required: true

   - type: textarea
     id: environment
     attributes:
@@ -61,7 +61,7 @@ body:
         - Database: [e.g. Neo4j]
     validations:
       required: true

   - type: textarea
     id: logs
     attributes:
@@ -71,7 +71,7 @@ body:
       render: shell
     validations:
       required: false

   - type: textarea
     id: additional
     attributes:
@@ -80,7 +80,7 @@ body:
       placeholder: Any additional information...
     validations:
       required: false

   - type: checkboxes
     id: checklist
     attributes:
.github/ISSUE_TEMPLATE/documentation.yml (13 changes, vendored)

@@ -8,7 +8,7 @@ body:
     attributes:
       value: |
         Thanks for helping improve our documentation! Please provide details about the documentation issue or improvement.

   - type: dropdown
     id: doc-type
     attributes:
@@ -22,7 +22,7 @@ body:
         - New documentation request
     validations:
       required: true

   - type: textarea
     id: location
     attributes:
@@ -31,7 +31,7 @@ body:
       placeholder: https://cognee.ai/docs/... or specific file/section
     validations:
       required: true

   - type: textarea
     id: issue
     attributes:
@@ -40,7 +40,7 @@ body:
       placeholder: The documentation is unclear about...
     validations:
       required: true

   - type: textarea
     id: suggestion
     attributes:
@@ -49,7 +49,7 @@ body:
       placeholder: I suggest changing this to...
     validations:
       required: false

   - type: textarea
     id: additional
     attributes:
@@ -58,7 +58,7 @@ body:
       placeholder: Additional context...
     validations:
       required: false

   - type: checkboxes
     id: checklist
     attributes:
@@ -71,4 +71,3 @@ body:
           required: true
         - label: I have specified the location of the documentation issue
           required: true
.github/ISSUE_TEMPLATE/feature_request.yml (15 changes, vendored)

@@ -8,7 +8,7 @@ body:
     attributes:
       value: |
         Thanks for suggesting a new feature! Please provide a clear and detailed description of your idea.

   - type: textarea
     id: problem
     attributes:
@@ -17,7 +17,7 @@ body:
       placeholder: I'm always frustrated when...
     validations:
       required: true

   - type: textarea
     id: solution
     attributes:
@@ -26,7 +26,7 @@ body:
       placeholder: I would like to see...
     validations:
       required: true

   - type: textarea
     id: alternatives
     attributes:
@@ -35,7 +35,7 @@ body:
       placeholder: I have also considered...
     validations:
       required: false

   - type: textarea
     id: use-case
     attributes:
@@ -44,7 +44,7 @@ body:
       placeholder: This feature would help me...
     validations:
       required: true

   - type: textarea
     id: implementation
     attributes:
@@ -53,7 +53,7 @@ body:
       placeholder: This could be implemented by...
     validations:
       required: false

   - type: textarea
     id: additional
     attributes:
@@ -62,7 +62,7 @@ body:
       placeholder: Additional context...
     validations:
       required: false

   - type: checkboxes
     id: checklist
     attributes:
@@ -75,4 +75,3 @@ body:
           required: true
         - label: I have described my specific use case
           required: true
.github/actions/setup_neo4j/action.yml (8 changes, vendored)

@@ -34,14 +34,14 @@ runs:
           -e NEO4J_apoc_export_file_enabled=true \
           -e NEO4J_apoc_import_file_enabled=true \
           neo4j:${{ inputs.neo4j-version }}

     - name: Wait for Neo4j to be ready
       shell: bash
       run: |
         echo "Waiting for Neo4j to start..."
         timeout=60
         counter=0

         while [ $counter -lt $timeout ]; do
           if docker exec neo4j-test cypher-shell -u neo4j -p "${{ inputs.neo4j-password }}" "RETURN 1" > /dev/null 2>&1; then
             echo "Neo4j is ready!"
@@ -51,13 +51,13 @@ runs:
           sleep 2
           counter=$((counter + 2))
         done

         if [ $counter -ge $timeout ]; then
           echo "Neo4j failed to start within $timeout seconds"
           docker logs neo4j-test
           exit 1
         fi

     - name: Verify GDS is available
       shell: bash
       run: |
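The "Wait for Neo4j to be ready" step above is a bounded polling loop: retry a probe until it succeeds or a deadline passes. The same pattern, sketched generically (the `wait_for` helper is illustrative; the real step shells out to `cypher-shell` inside the container):

```python
import time

def wait_for(probe, timeout_s: int = 60, interval_s: int = 2) -> bool:
    """Retry `probe` until it returns True or `timeout_s` seconds elapse."""
    elapsed = 0
    while elapsed < timeout_s:
        if probe():
            return True
        time.sleep(interval_s)
        elapsed += interval_s
    return False

# A probe that succeeds immediately returns on the first iteration.
print(wait_for(lambda: True, timeout_s=5))  # → True
```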
.github/core-team.txt (2 changes, vendored)

@@ -8,5 +8,3 @@ lxobr
 pazone
 siillee
 vasilije1990
.github/pull_request_template.md (11 changes, vendored)

@@ -10,26 +10,21 @@ DO NOT use AI-generated descriptions. We want to understand your thought process
 <!--
 * Key requirements to the new feature or modification;
 * Proof that the changes work and meet the requirements;
-* Include instructions on how to verify the changes. Describe how to test it locally;
-* Proof that it's sufficiently tested.
 -->

 ## Type of Change
 <!-- Please check the relevant option -->
 - [ ] Bug fix (non-breaking change that fixes an issue)
 - [ ] New feature (non-breaking change that adds functionality)
-- [ ] Breaking change (fix or feature that would cause existing functionality to change)
-- [ ] Documentation update
 - [ ] Code refactoring
-- [ ] Performance improvement
 - [ ] Other (please specify):

-## Screenshots/Videos (if applicable)
-<!-- Add screenshots or videos to help explain your changes -->
+## Screenshots
+<!-- ADD SCREENSHOT OF LOCAL TESTS PASSING-->

 ## Pre-submission Checklist
 <!-- Please check all boxes that apply before submitting your PR -->
-- [ ] **I have tested my changes thoroughly before submitting this PR**
+- [ ] **I have tested my changes thoroughly before submitting this PR** (See `CONTRIBUTING.md`)
 - [ ] **This PR contains minimal changes necessary to address the issue/feature**
 - [ ] My code follows the project's coding standards and style guidelines
 - [ ] I have added tests that prove my fix is effective or that my feature works
.github/release-drafter.yml (2 changes, vendored)

@@ -3,7 +3,7 @@ tag-template: 'v$NEXT_PATCH_VERSION'
 categories:
   - title: 'Features'
     labels: ['feature', 'enhancement']
   - title: 'Bug Fixes'
     labels: ['bug', 'fix']
   - title: 'Maintenance'
.github/workflows/basic_tests.yml (37 changes, vendored)

@@ -34,43 +34,6 @@ env:
   ENV: 'dev'

 jobs:

-  lint:
-    name: Run Linting
-    runs-on: ubuntu-22.04
-    steps:
-      - name: Check out repository
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Cognee Setup
-        uses: ./.github/actions/cognee_setup
-        with:
-          python-version: ${{ inputs.python-version }}
-
-      - name: Run Linting
-        uses: astral-sh/ruff-action@v2
-
-  format-check:
-    name: Run Formatting Check
-    runs-on: ubuntu-22.04
-    steps:
-      - name: Check out repository
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Cognee Setup
-        uses: ./.github/actions/cognee_setup
-        with:
-          python-version: ${{ inputs.python-version }}
-
-      - name: Run Formatting Check
-        uses: astral-sh/ruff-action@v2
-        with:
-          args: "format --check"

   unit-tests:
     name: Run Unit Tests
     runs-on: ubuntu-22.04
@@ -31,54 +31,54 @@ WORKFLOWS=(
 for workflow in "${WORKFLOWS[@]}"; do
   if [ -f "$workflow" ]; then
     echo "Processing $workflow..."

     # Create a backup
     cp "$workflow" "${workflow}.bak"

     # Skip files that already have a workflow_call trigger
     if grep -q "workflow_call:" "$workflow"; then
       echo "$workflow already has workflow_call trigger, skipping..."
       continue
     fi

     # Find the line number of the 'on:' section
     on_line=$(grep -n "^on:" "$workflow" | cut -d ':' -f1)

     if [ -z "$on_line" ]; then
       echo "Warning: No 'on:' section found in $workflow, skipping..."
       continue
     fi

     # Create a new file with the modified content
     {
       # Copy the part before 'on:'
       head -n $((on_line-1)) "$workflow"

       # Add a new on: section that only includes workflow_call
       echo "on:"
       echo "  workflow_call:"
       echo "    secrets:"
       echo "      inherit: true"

       # Find where to continue after the original 'on:' section
       next_section=$(awk "NR > $on_line && /^[a-z]/ {print NR; exit}" "$workflow")

       if [ -z "$next_section" ]; then
         next_section=$(wc -l < "$workflow")
         next_section=$((next_section+1))
       fi

       # Copy the rest of the file starting from the next section
       tail -n +$next_section "$workflow"
     } > "${workflow}.new"

     # Replace the original with the new version
     mv "${workflow}.new" "$workflow"

     echo "Modified $workflow to only run when called from test-suites.yml"
   else
     echo "Warning: $workflow not found, skipping..."
   fi
 done

 echo "Finished modifying workflows!"
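The `head`/`tail` splice performed by the shell script above can be mirrored in a few lines of Python. A sketch under the same assumptions (the sample workflow text is made up; "next top-level key" means the first line after `on:` that starts with a lowercase letter, exactly as the `awk` expression scans):

```python
sample = "name: demo\non:\n  push:\njobs:\n  build:\n"
lines = sample.splitlines()

# Index of the 'on:' line (the shell script uses grep -n "^on:").
on_idx = next(i for i, line in enumerate(lines) if line.startswith("on:"))
# First top-level key after 'on:' (the awk /^[a-z]/ scan); end of file if none.
next_idx = next(
    (i for i in range(on_idx + 1, len(lines)) if lines[i][:1].islower()),
    len(lines),
)

trigger = ["on:", "  workflow_call:", "    secrets:", "      inherit: true"]
rewritten = lines[:on_idx] + trigger + lines[next_idx:]
print("\n".join(rewritten))
```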
.github/workflows/dockerhub.yml (2 changes, vendored)

@@ -45,4 +45,4 @@ jobs:
       cache-to: type=registry,ref=cognee/cognee:buildcache,mode=max

   - name: Image digest
     run: echo ${{ steps.build.outputs.digest }}
.github/workflows/label-core-team.yml (2 changes, vendored)

@@ -72,5 +72,3 @@ jobs:
 } catch (error) {
   core.warning(`Failed to add label: ${error.message}`);
 }
.github/workflows/load_tests.yml (2 changes, vendored)

@@ -66,5 +66,3 @@ jobs:
     AWS_ACCESS_KEY_ID: ${{ secrets.AWS_S3_DEV_USER_KEY_ID }}
     AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_S3_DEV_USER_SECRET_KEY }}
   run: uv run python ./cognee/tests/test_load.py
.github/workflows/pre_test.yml (7 changes, vendored)

@@ -5,7 +5,7 @@ permissions:
   contents: read
 jobs:
   check-uv-lock:
-    name: Validate uv lockfile and project metadata
+    name: Lockfile and Pre-commit Hooks
     runs-on: ubuntu-22.04
     steps:
       - name: Check out repository
@@ -17,6 +17,9 @@ jobs:
         uses: astral-sh/setup-uv@v4
         with:
           enable-cache: true

       - name: Validate uv lockfile and project metadata
         run: uv lock --check || { echo "'uv lock --check' failed."; echo "Run 'uv lock' and push your changes."; exit 1; }

+      - name: Run pre-commit hooks
+        uses: pre-commit/action@v3.0.1
.github/workflows/release.yml (26 changes, vendored)

@@ -42,10 +42,10 @@ jobs:
         echo "tag=${TAG}" >> "$GITHUB_OUTPUT"
         echo "version=${VERSION}" >> "$GITHUB_OUTPUT"

         git tag "${TAG}"
         git push origin "${TAG}"

     - name: Create GitHub Release
       uses: softprops/action-gh-release@v2
@@ -54,8 +54,8 @@ jobs:
         generate_release_notes: true
       env:
         GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

   release-pypi-package:
     needs: release-github
     name: Release PyPI Package from ${{ inputs.flavour }}
     permissions:
@@ -67,25 +67,25 @@ jobs:
       uses: actions/checkout@v4
       with:
         ref: ${{ inputs.flavour }}

     - name: Install uv
       uses: astral-sh/setup-uv@v7

     - name: Install Python
       run: uv python install

     - name: Install dependencies
       run: uv sync --locked --all-extras

     - name: Build distributions
       run: uv build

     - name: Publish ${{ inputs.flavour }} release to PyPI
       env:
         UV_PUBLISH_TOKEN: ${{ secrets.PYPI_TOKEN }}
       run: uv publish

   release-docker-image:
     needs: release-github
     name: Release Docker Image from ${{ inputs.flavour }}
     permissions:
@@ -128,7 +128,7 @@ jobs:
       context: .
       platforms: linux/amd64,linux/arm64
       push: true
       tags: |
         cognee/cognee:${{ needs.release-github.outputs.version }}
         cognee/cognee:latest
       labels: |
@@ -163,4 +163,4 @@ jobs:
         -H "Authorization: Bearer ${{ secrets.REPO_DISPATCH_PAT_TOKEN }}" \
         -H "X-GitHub-Api-Version: 2022-11-28" \
         https://api.github.com/repos/topoteretes/cognee-community/dispatches \
         -d '{"event_type":"new-main-release","client_payload":{"caller_repo":"'"${GITHUB_REPOSITORY}"'"}}'
.github/workflows/release_test.yml (1 change, vendored)

@@ -15,4 +15,3 @@ jobs:
     name: Load Tests
     uses: ./.github/workflows/load_tests.yml
     secrets: inherit
@@ -10,7 +10,7 @@ on:
       required: false
       type: string
       default: '["3.10.x", "3.12.x", "3.13.x"]'
     os:
       required: false
       type: string
       default: '["ubuntu-22.04", "macos-15", "windows-latest"]'
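The `default` values above are JSON-encoded strings rather than YAML lists; the calling workflow typically expands them with `fromJSON()` to build a test matrix. `json.loads` mirrors that expansion (the variable names below are illustrative):

```python
import json

python_versions = json.loads('["3.10.x", "3.12.x", "3.13.x"]')
operating_systems = json.loads('["ubuntu-22.04", "macos-15", "windows-latest"]')

# Each matrix combination pairs one Python version with one runner image.
print(len(python_versions) * len(operating_systems))  # → 9
```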
.github/workflows/test_llms.yml (2 changes, vendored)

@@ -173,4 +173,4 @@ jobs:
         EMBEDDING_MODEL: "amazon.titan-embed-text-v2:0"
         EMBEDDING_DIMENSIONS: "1024"
         EMBEDDING_MAX_TOKENS: "8191"
       run: uv run python ./examples/python/simple_example.py
.github/workflows/test_suites.yml (4 changes, vendored)

@@ -18,11 +18,11 @@ env:
   RUNTIME__LOG_LEVEL: ERROR
   ENV: 'dev'

 jobs:
   pre-test:
     name: basic checks
     uses: ./.github/workflows/pre_test.yml

   basic-tests:
     name: Basic Tests
     uses: ./.github/workflows/basic_tests.yml
.gitignore (2 changes, vendored)

@@ -147,6 +147,8 @@ venv/
 ENV/
 env.bak/
 venv.bak/
+mise.toml
+deployment/helm/values-local.yml

 # Spyder project settings
 .spyderproject
@@ -6,4 +6,4 @@ pull_request_rules:
     actions:
       backport:
         branches:
           - main
@@ -7,6 +7,7 @@ repos:
       - id: trailing-whitespace
       - id: end-of-file-fixer
      - id: check-yaml
+        exclude: ^deployment/helm/templates/
       - id: check-added-large-files
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
@@ -128,5 +128,3 @@ MCP server and Frontend:
 ## CI Mirrors Local Commands

 Our GitHub Actions run the same ruff checks and pytest suites shown above (`.github/workflows/basic_tests.yml` and related workflows). Use the commands in this document locally to minimize CI surprises.
@@ -1,16 +1,16 @@
 > [!IMPORTANT]
 > **Note for contributors:** When branching out, create a new branch from the `dev` branch.

 # 🎉 Welcome to **cognee**!

 We're excited that you're interested in contributing to our project!
 We want to ensure that every user and contributor feels welcome, included and supported to participate in the cognee community.
 This guide will help you get started and ensure your contributions can be efficiently integrated into the project.

 ## 🌟 Quick Links

 - [Code of Conduct](CODE_OF_CONDUCT.md)
 - [Discord Community](https://discord.gg/bcy8xFAtfd)
 - [Issue Tracker](https://github.com/topoteretes/cognee/issues)
 - [Cognee Docs](https://docs.cognee.ai)

@@ -62,6 +62,11 @@ Looking for a place to start? Try filtering for [good first issues](https://gith
 ## 2. 🛠️ Development Setup

+### Required tools
+* [Python](https://www.python.org/downloads/)
+* [uv](https://docs.astral.sh/uv/getting-started/installation/)
+* pre-commit: `uv run pip install pre-commit && pre-commit install`

 ### Fork and Clone

 1. Fork the [**cognee**](https://github.com/topoteretes/cognee) repository

@@ -93,14 +98,31 @@ git checkout -b feature/your-feature-name
 4. **Commits**: Write clear commit messages

 ### Running Tests

+Rename `.env.example` to `.env` and provide your OPENAI_API_KEY as LLM_API_KEY.
+
 ```shell
-python cognee/cognee/tests/test_library.py
+uv run python cognee/tests/test_library.py
 ```

+### Running Simple Example
+
+Rename `.env.example` to `.env` and provide your OPENAI_API_KEY as LLM_API_KEY.
+
+Make sure to run `uv sync` in the root of the cloned folder, or set up a virtual environment, before running cognee.
+
+```shell
+python examples/python/simple_example.py
+```
+or
+```shell
+uv run python examples/python/simple_example.py
 ```

 ## 4. 📤 Submitting Changes

-1. Install ruff on your system
-2. Run `ruff format .` and `ruff check` and fix the issues
+1. Make sure that `pre-commit` and its hooks are installed. See the `Required tools` section for more information. Try executing `pre-commit run` if you are not sure.
 3. Push your changes:
 ```shell
 git add .
|
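The test and example steps above both start by renaming `.env.example` to `.env` and setting `LLM_API_KEY`. As a rough illustration of what such a file contains and how the variables end up in the process environment, here is a stdlib-only sketch; the parser and file name are illustrative, and cognee's actual loading is handled by its own settings code:

```python
import os

def load_env(path):
    """Parse simple KEY="value" lines from a .env-style file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')

# Illustrative .env contents, matching the keys in the repo's .env.example
with open("demo.env", "w") as f:
    f.write('LLM_API_KEY="your-openai-api-key"\nLLM_PROVIDER="openai"\n')

load_env("demo.env")
print(os.environ["LLM_PROVIDER"])  # openai
```

Real `.env` parsers (e.g. python-dotenv) handle quoting, export prefixes, and interpolation; this sketch only covers the simple `KEY="value"` case shown above.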
@@ -32,7 +32,7 @@ COPY README.md pyproject.toml uv.lock entrypoint.sh ./

# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/root/.cache/uv \
-    uv sync --extra debug --extra api --extra postgres --extra neo4j --extra llama-index --extra ollama --extra mistral --extra groq --extra anthropic --frozen --no-install-project --no-dev --no-editable
+    uv sync --extra debug --extra api --extra postgres --extra neo4j --extra llama-index --extra ollama --extra mistral --extra groq --extra anthropic --extra chromadb --frozen --no-install-project --no-dev --no-editable

# Copy Alembic configuration
COPY alembic.ini /app/alembic.ini

@@ -43,7 +43,7 @@ COPY alembic/ /app/alembic

COPY ./cognee /app/cognee
COPY ./distributed /app/distributed
RUN --mount=type=cache,target=/root/.cache/uv \
-    uv sync --extra debug --extra api --extra postgres --extra neo4j --extra llama-index --extra ollama --extra mistral --extra groq --extra anthropic --frozen --no-dev --no-editable
+    uv sync --extra debug --extra api --extra postgres --extra neo4j --extra llama-index --extra ollama --extra mistral --extra groq --extra anthropic --extra chromadb --frozen --no-dev --no-editable

FROM python:3.12-slim-bookworm
12 README.md

@@ -65,12 +65,12 @@ Use your data to build personalized and dynamic memory for AI Agents. Cognee let

## About Cognee

Cognee is an open-source tool and platform that transforms your raw data into persistent and dynamic AI memory for Agents. It combines vector search with graph databases to make your documents both searchable by meaning and connected by relationships.

You can use Cognee in two ways:

1. [Self-host Cognee Open Source](https://docs.cognee.ai/getting-started/installation), which stores all data locally by default.
2. [Connect to Cognee Cloud](https://platform.cognee.ai/), and get the same OSS stack on managed infrastructure for easier development and productionization.

### Cognee Open Source (self-hosted):

@@ -81,8 +81,8 @@ You can use Cognee in two ways:

- Offers high customizability through user-defined tasks, modular pipelines, and built-in search endpoints

### Cognee Cloud (managed):

- Hosted web UI dashboard
- Automatic version updates
- Resource usage analytics
- GDPR compliant, enterprise-grade security

@@ -119,7 +119,7 @@ To integrate other LLM providers, see our [LLM Provider Documentation](https://d

### Step 3: Run the Pipeline

Cognee will take your documents, generate a knowledge graph from them and then query the graph based on combined relationships.

Now, run a minimal pipeline:

@@ -157,7 +157,7 @@ As you can see, the output is generated from the document we previously stored i

Cognee turns documents into AI memory.
```

### Use the Cognee CLI

As an alternative, you can get started with these essential commands:
@@ -1 +1 @@

Generic single-database configuration with an async dbapi.
@@ -0,0 +1,52 @@
+"""Enable delete for old tutorial notebooks
+
+Revision ID: 1a58b986e6e1
+Revises: 46a6ce2bd2b2
+Create Date: 2025-12-17 11:04:44.414259
+
+"""
+
+from typing import Sequence, Union
+
+from alembic import op
+import sqlalchemy as sa
+
+
+# revision identifiers, used by Alembic.
+revision: str = "1a58b986e6e1"
+down_revision: Union[str, None] = "e1ec1dcb50b6"
+branch_labels: Union[str, Sequence[str], None] = None
+depends_on: Union[str, Sequence[str], None] = None
+
+
+def change_tutorial_deletable_flag(deletable: bool) -> None:
+    bind = op.get_bind()
+    inspector = sa.inspect(bind)
+
+    if "notebooks" not in inspector.get_table_names():
+        return
+
+    columns = {col["name"] for col in inspector.get_columns("notebooks")}
+    required_columns = {"name", "deletable"}
+    if not required_columns.issubset(columns):
+        return
+
+    notebooks = sa.table(
+        "notebooks",
+        sa.Column("name", sa.String()),
+        sa.Column("deletable", sa.Boolean()),
+    )
+
+    tutorial_name = "Python Development with Cognee Tutorial 🧠"
+
+    bind.execute(
+        notebooks.update().where(notebooks.c.name == tutorial_name).values(deletable=deletable)
+    )
+
+
+def upgrade() -> None:
+    change_tutorial_deletable_flag(True)
+
+
+def downgrade() -> None:
+    change_tutorial_deletable_flag(False)
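The migration above is deliberately defensive: it inspects the live database and silently no-ops when the `notebooks` table or either required column is missing, so it can run against schemas created at different points in history. A stand-alone sketch of that inspect-before-update pattern, using stdlib sqlite3 instead of Alembic/SQLAlchemy (the table contents here are illustrative, not the real tutorial data):

```python
import sqlite3

def set_deletable(conn, name, deletable):
    """Flip the deletable flag, but only if the table and both columns exist."""
    tables = {row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")}
    if "notebooks" not in tables:
        return 0  # nothing to migrate on this schema
    columns = {row[1] for row in conn.execute("PRAGMA table_info(notebooks)")}
    if not {"name", "deletable"}.issubset(columns):
        return 0  # schema predates these columns
    cursor = conn.execute(
        "UPDATE notebooks SET deletable = ? WHERE name = ?",
        (int(deletable), name),
    )
    return cursor.rowcount  # number of rows actually updated

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notebooks (name TEXT, deletable INTEGER)")
conn.execute("INSERT INTO notebooks VALUES ('Tutorial', 0)")
print(set_deletable(conn, "Tutorial", True))  # 1
```

The real migration uses `sa.inspect(bind)` for the same checks, which keeps it portable across the Postgres and SQLite backends cognee supports.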
@@ -43,10 +43,10 @@ Saiba mais sobre os [casos de uso](https://docs.cognee.ai/use-cases) e [avaliaç

## Features

- Connect and retrieve your past conversations, documents, images, and audio transcriptions
- Reduce hallucinations, developer effort, and costs
- Load data into graph and vector databases using only Pydantic
- Transform and organize your data while collecting it from more than 30 different sources

## Getting Started

@@ -108,7 +108,7 @@ if __name__ == '__main__':

Example output:
```
Natural Language Processing (NLP) is an interdisciplinary and transdisciplinary field involving computer science and information retrieval. It focuses on the interaction between computers and human language, enabling machines to understand and process natural language.
```

Graph visualization:
@@ -141,7 +141,7 @@ if __name__ == '__main__':

2. A simple GraphRAG demo

[Video](https://github.com/user-attachments/assets/d80b0776-4eb9-4b8e-aa22-3691e2d44b8f)

3. Cognee with Ollama

[Video](https://github.com/user-attachments/assets/8621d3e8-ecb8-4860-afb2-5594f2ee17db)

## Code of Conduct
@@ -114,7 +114,7 @@ if __name__ == '__main__':

Example output:
```
Natural Language Processing (NLP) is an interdisciplinary field spanning computer science and information retrieval. It focuses on the interaction between computers and human language, enabling machines to understand and process natural language.
```

Graph visualization:
<a href="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/main/assets/graph_visualization.html"><img src="https://rawcdn.githack.com/topoteretes/cognee/refs/heads/main/assets/graph_visualization.png" width="100%" alt="Graph visualization"></a>
1286 cognee-frontend/package-lock.json (generated)

File diff suppressed because it is too large.
@@ -9,14 +9,15 @@

     "lint": "next lint"
   },
   "dependencies": {
-    "@auth0/nextjs-auth0": "^4.13.1",
+    "@auth0/nextjs-auth0": "^4.14.0",
     "classnames": "^2.5.1",
     "culori": "^4.0.1",
     "d3-force-3d": "^3.0.6",
-    "next": "16.0.4",
+    "next": "^16.1.7",
-    "react": "^19.2.0",
+    "react": "^19.2.3",
-    "react-dom": "^19.2.0",
+    "react-dom": "^19.2.3",
     "react-force-graph-2d": "^1.27.1",
+    "react-markdown": "^10.1.0",
     "uuid": "^9.0.1"
   },
   "devDependencies": {
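A note on the version specifiers above: `next` moves from the exact pin `16.0.4` to the caret range `^16.1.7`, which accepts any later `16.x` release but never `17.0.0`. A simplified sketch of the caret rule for majors ≥ 1 (npm's full semver grammar also covers `^0.x` ranges, pre-release tags, and more):

```python
def satisfies_caret(version, spec):
    """Check 'version' against a '^x.y.z' npm range: same major, and >= x.y.z.
    Simplified: handles plain numeric versions with major >= 1 only."""
    base = tuple(int(part) for part in spec.lstrip("^").split("."))
    ver = tuple(int(part) for part in version.split("."))
    return ver[0] == base[0] and ver >= base

print(satisfies_caret("16.1.8", "^16.1.7"))   # True
print(satisfies_caret("17.0.0", "^16.1.7"))   # False
```

This is why the `package-lock.json` diff above is so large: widening pins to caret ranges lets the resolver pick newer patch and minor releases for the whole dependency tree.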
@@ -55,7 +55,7 @@ export default function CogneeAddWidget({ onData, useCloud = false }: CogneeAddW

    setTrue: setProcessingFilesInProgress,
    setFalse: setProcessingFilesDone,
  } = useBoolean(false);

  const handleAddFiles = (dataset: Dataset, event: ChangeEvent<HTMLInputElement>) => {
    event.stopPropagation();
@@ -111,7 +111,7 @@ export default function GraphControls({ data, isAddNodeFormOpen, onGraphShapeCha

  const [isAuthShapeChangeEnabled, setIsAuthShapeChangeEnabled] = useState(true);
  const shapeChangeTimeout = useRef<number | null>(null);

  useEffect(() => {
    onGraphShapeChange(DEFAULT_GRAPH_SHAPE);
@@ -57,7 +57,7 @@ export default function GraphVisualization({ ref, data, graphControls, className

    // Initial size calculation
    handleResize();

    // ResizeObserver
    const resizeObserver = new ResizeObserver(() => {
      handleResize();
    });

@@ -216,7 +216,7 @@ export default function GraphVisualization({ ref, data, graphControls, className

  }, [data, graphRef]);

  const [graphShape, setGraphShape] = useState<string>();

  const zoomToFit: ForceGraphMethods["zoomToFit"] = (
    durationMs?: number,
    padding?: number,

@@ -227,15 +227,15 @@ export default function GraphVisualization({ ref, data, graphControls, className

      // eslint-disable-next-line @typescript-eslint/no-explicit-any
      return undefined as any;
    }

    return graphRef.current.zoomToFit?.(durationMs, padding, nodeFilter);
  };

  useImperativeHandle(ref, () => ({
    zoomToFit,
    setGraphShape,
  }));

  return (
    <div ref={containerRef} className={classNames("w-full h-full", className)} id="graph-container">
@@ -1373,4 +1373,4 @@

      "padding": 20
    }
  }
}
@@ -15,6 +15,8 @@ import AddDataToCognee from "./AddDataToCognee";

 import NotebooksAccordion from "./NotebooksAccordion";
 import CogneeInstancesAccordion from "./CogneeInstancesAccordion";
 import InstanceDatasetsAccordion from "./InstanceDatasetsAccordion";
+import cloudFetch from "@/modules/instances/cloudFetch";
+import localFetch from "@/modules/instances/localFetch";

 interface DashboardProps {
   user?: {

@@ -26,6 +28,17 @@ interface DashboardProps {

   accessToken: string;
 }

+const cogneeInstances = {
+  cloudCognee: {
+    name: "CloudCognee",
+    fetch: cloudFetch,
+  },
+  localCognee: {
+    name: "LocalCognee",
+    fetch: localFetch,
+  }
+};

 export default function Dashboard({ accessToken }: DashboardProps) {
   fetch.setAccessToken(accessToken);
   const { user } = useAuthenticatedUser();

@@ -38,7 +51,7 @@ export default function Dashboard({ accessToken }: DashboardProps) {

     updateNotebook,
     saveNotebook,
     removeNotebook,
-  } = useNotebooks();
+  } = useNotebooks(cogneeInstances.localCognee);

   useEffect(() => {
     if (!notebooks.length) {
@@ -134,7 +134,7 @@ export default function DatasetsAccordion({

  } = useBoolean(false);

  const [datasetToRemove, setDatasetToRemove] = useState<Dataset | null>(null);

  const handleDatasetRemove = (dataset: Dataset) => {
    setDatasetToRemove(dataset);
    openRemoveDatasetModal();
@@ -3,6 +3,7 @@ import { useCallback, useEffect } from "react";

 import { fetch, isCloudEnvironment, useBoolean } from "@/utils";
 import { checkCloudConnection } from "@/modules/cloud";
+import { setApiKey } from "@/modules/instances/cloudFetch";
 import { CaretIcon, CloseIcon, CloudIcon, LocalCogneeIcon } from "@/ui/Icons";
 import { CTAButton, GhostButton, IconButton, Input, Modal } from "@/ui/elements";

@@ -24,6 +25,7 @@ export default function InstanceDatasetsAccordion({ onDatasetsChange }: Instance

   const checkConnectionToCloudCognee = useCallback((apiKey?: string) => {
     if (apiKey) {
       fetch.setApiKey(apiKey);
+      setApiKey(apiKey);
     }
     return checkCloudConnection()
       .then(setCloudCogneeConnected)
@@ -45,7 +45,7 @@ export default function Plan() {

      <div className="bg-white rounded-xl px-5 py-5 mb-2">
        Affordable and transparent pricing
      </div>

      <div className="grid grid-cols-3 gap-x-2.5">
        <div className="pt-13 py-4 px-5 mb-2.5 rounded-tl-xl rounded-tr-xl bg-white h-full">
          <div>Basic</div>
@@ -40,7 +40,7 @@ export default function useChat(dataset: Dataset) {

    setTrue: disableSearchRun,
    setFalse: enableSearchRun,
  } = useBoolean(false);

  const refreshChat = useCallback(async () => {
    const data = await fetchMessages();
    return setMessages(data);
@@ -46,7 +46,7 @@ function useDatasets(useCloud = false) {

  //     checkDatasetStatuses(datasets);
  //   }, 50000);
  // }, [fetchDatasetStatuses]);

  // useEffect(() => {
  //   return () => {
  //     if (statusTimeout.current !== null) {

@@ -95,6 +95,7 @@ function useDatasets(useCloud = false) {

       })
       .catch((error) => {
         console.error('Error fetching datasets:', error);
+        throw error;
       });
   }, [useCloud]);
59 cognee-frontend/src/modules/instances/cloudFetch.ts (new file)

@@ -0,0 +1,59 @@
+import handleServerErrors from "@/utils/handleServerErrors";
+
+// let numberOfRetries = 0;
+
+const cloudApiUrl = process.env.NEXT_PUBLIC_CLOUD_API_URL || "http://localhost:8001";
+
+let apiKey: string | null = process.env.NEXT_PUBLIC_COGWIT_API_KEY || null;
+
+export function setApiKey(newApiKey: string) {
+  apiKey = newApiKey;
+}
+
+export default async function cloudFetch(url: URL | RequestInfo, options: RequestInit = {}): Promise<Response> {
+  // function retry(lastError: Response) {
+  //   if (numberOfRetries >= 1) {
+  //     return Promise.reject(lastError);
+  //   }
+  //
+  //   numberOfRetries += 1;
+  //
+  //   return global.fetch("/auth/token")
+  //     .then(() => {
+  //       return fetch(url, options);
+  //     });
+  // }
+
+  const authHeaders = {
+    "Authorization": `X-Api-Key ${apiKey}`,
+  };
+
+  return global.fetch(
+    cloudApiUrl + "/api" + (typeof url === "string" ? url : url.toString()).replace("/v1", ""),
+    {
+      ...options,
+      headers: {
+        ...options.headers,
+        ...authHeaders,
+      } as HeadersInit,
+      credentials: "include",
+    },
+  )
+    .then((response) => handleServerErrors(response, null, true))
+    .catch((error) => {
+      if (error.message === "NEXT_REDIRECT") {
+        throw error;
+      }
+
+      if (error.detail === undefined) {
+        return Promise.reject(
+          new Error("No connection to the server.")
+        );
+      }
+
+      return Promise.reject(error);
+    });
+    // .finally(() => {
+    //   numberOfRetries = 0;
+    // });
+}
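`cloudFetch` does three things to every request: it prefixes the cloud base URL plus `/api`, strips the `/v1` segment from the path, and attaches the key as an `Authorization: X-Api-Key …` header. A minimal Python sketch of just that request-shaping logic (the function name and defaults are illustrative, not part of the repo):

```python
def build_cloud_request(path, api_key, base="http://localhost:8001"):
    """Mirror cloudFetch's URL rewrite and auth header construction.

    The cloud API exposes the same routes as the local API but without
    the /v1 prefix, so the path segment is stripped before prefixing.
    """
    full_url = base + "/api" + path.replace("/v1", "")
    headers = {"Authorization": f"X-Api-Key {api_key}"}
    return full_url, headers

full_url, headers = build_cloud_request("/v1/notebooks/", "secret")
print(full_url)  # http://localhost:8001/api/notebooks/
```

The actual wrapper also sends `credentials: "include"` and funnels errors through `handleServerErrors`; only the URL and header shaping is shown here.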
27 cognee-frontend/src/modules/instances/localFetch.ts (new file)

@@ -0,0 +1,27 @@
+import handleServerErrors from "@/utils/handleServerErrors";
+
+const localApiUrl = process.env.NEXT_PUBLIC_LOCAL_API_URL || "http://localhost:8000";
+
+export default async function localFetch(url: URL | RequestInfo, options: RequestInit = {}): Promise<Response> {
+  return global.fetch(
+    localApiUrl + "/api" + (typeof url === "string" ? url : url.toString()),
+    {
+      ...options,
+      credentials: "include",
+    },
+  )
+    .then((response) => handleServerErrors(response, null, false))
+    .catch((error) => {
+      if (error.message === "NEXT_REDIRECT") {
+        throw error;
+      }
+
+      if (error.detail === undefined) {
+        return Promise.reject(
+          new Error("No connection to the server.")
+        );
+      }
+
+      return Promise.reject(error);
+    });
+}
4 cognee-frontend/src/modules/instances/types.ts (new file)

@@ -0,0 +1,4 @@
+export interface CogneeInstance {
+  name: string;
+  fetch: typeof global.fetch;
+}
13 cognee-frontend/src/modules/notebooks/createNotebook.ts (new file)

@@ -0,0 +1,13 @@
+import { CogneeInstance } from "@/modules/instances/types";
+
+export default function createNotebook(notebookName: string, instance: CogneeInstance) {
+  return instance.fetch("/v1/notebooks/", {
+    body: JSON.stringify({ name: notebookName }),
+    method: "POST",
+    headers: {
+      "Content-Type": "application/json",
+    },
+  }).then((response: Response) =>
+    response.ok ? response.json() : Promise.reject(response)
+  );
+}
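`createNotebook` is a thin wrapper: a `POST /v1/notebooks/` with a JSON body, then a resolve/reject split on `response.ok`. A stdlib Python sketch that builds the equivalent request object without sending it (the base URL mirrors the `localFetch` default; the helper name is an assumption):

```python
import json
import urllib.request

def create_notebook_request(name, base="http://localhost:8000/api"):
    """Build (but do not send) the POST request that createNotebook issues."""
    return urllib.request.Request(
        base + "/v1/notebooks/",
        data=json.dumps({"name": name}).encode(),
        method="POST",
        headers={"Content-Type": "application/json"},
    )

req = create_notebook_request("My notebook")
print(req.method, req.full_url)  # POST http://localhost:8000/api/v1/notebooks/
```

Sending it with `urllib.request.urlopen(req)` would require a running cognee API; the sketch only demonstrates the request shape the helpers below all share.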
7 cognee-frontend/src/modules/notebooks/deleteNotebook.ts (new file)

@@ -0,0 +1,7 @@
+import { CogneeInstance } from "@/modules/instances/types";
+
+export default function deleteNotebook(notebookId: string, instance: CogneeInstance) {
+  return instance.fetch(`/v1/notebooks/${notebookId}`, {
+    method: "DELETE",
+  });
+}
12 cognee-frontend/src/modules/notebooks/getNotebooks.ts (new file)

@@ -0,0 +1,12 @@
+import { CogneeInstance } from "@/modules/instances/types";
+
+export default function getNotebooks(instance: CogneeInstance) {
+  return instance.fetch("/v1/notebooks/", {
+    method: "GET",
+    headers: {
+      "Content-Type": "application/json",
+    },
+  }).then((response: Response) =>
+    response.ok ? response.json() : Promise.reject(response)
+  );
+}
14 cognee-frontend/src/modules/notebooks/runNotebookCell.ts (new file)

@@ -0,0 +1,14 @@
+import { Cell } from "@/ui/elements/Notebook/types";
+import { CogneeInstance } from "@/modules/instances/types";
+
+export default function runNotebookCell(notebookId: string, cell: Cell, instance: CogneeInstance) {
+  return instance.fetch(`/v1/notebooks/${notebookId}/${cell.id}/run`, {
+    body: JSON.stringify({
+      content: cell.content,
+    }),
+    method: "POST",
+    headers: {
+      "Content-Type": "application/json",
+    },
+  }).then((response: Response) => response.json());
+}
13 cognee-frontend/src/modules/notebooks/saveNotebook.ts (new file)

@@ -0,0 +1,13 @@
+import { CogneeInstance } from "@/modules/instances/types";
+
+export default function saveNotebook(notebookId: string, notebookData: object, instance: CogneeInstance) {
+  return instance.fetch(`/v1/notebooks/${notebookId}`, {
+    body: JSON.stringify(notebookData),
+    method: "PUT",
+    headers: {
+      "Content-Type": "application/json",
+    },
+  }).then((response: Response) =>
+    response.ok ? response.json() : Promise.reject(response)
+  );
+}
@ -1,20 +1,18 @@
|
||||||
import { useCallback, useState } from "react";
|
import { useCallback, useState } from "react";
|
||||||
import { fetch, isCloudEnvironment } from "@/utils";
|
|
||||||
import { Cell, Notebook } from "@/ui/elements/Notebook/types";
|
import { Cell, Notebook } from "@/ui/elements/Notebook/types";
|
||||||
|
import { CogneeInstance } from "@/modules/instances/types";
|
||||||
|
import createNotebook from "./createNotebook";
|
||||||
|
import deleteNotebook from "./deleteNotebook";
|
||||||
|
import getNotebooks from "./getNotebooks";
|
||||||
|
import runNotebookCell from "./runNotebookCell";
|
||||||
|
import { default as persistNotebook } from "./saveNotebook";
|
||||||
|
|
||||||
function useNotebooks() {
|
function useNotebooks(instance: CogneeInstance) {
|
||||||
const [notebooks, setNotebooks] = useState<Notebook[]>([]);
|
const [notebooks, setNotebooks] = useState<Notebook[]>([]);
|
||||||
|
|
||||||
const addNotebook = useCallback((notebookName: string) => {
|
const addNotebook = useCallback((notebookName: string) => {
|
||||||
return fetch("/v1/notebooks", {
|
return createNotebook(notebookName, instance)
|
||||||
body: JSON.stringify({ name: notebookName }),
|
.then((notebook: Notebook) => {
|
||||||
method: "POST",
|
|
||||||
headers: {
|
|
||||||
"Content-Type": "application/json",
|
|
||||||
},
|
|
||||||
}, isCloudEnvironment())
|
|
||||||
.then((response) => response.json())
|
|
||||||
.then((notebook) => {
|
|
||||||
setNotebooks((notebooks) => [
|
setNotebooks((notebooks) => [
|
||||||
...notebooks,
|
...notebooks,
|
||||||
notebook,
|
notebook,
|
||||||
|
|
@ -22,36 +20,29 @@ function useNotebooks() {
|
||||||
|
|
||||||
return notebook;
|
return notebook;
|
||||||
});
|
});
|
||||||
}, []);
|
}, [instance]);
|
||||||
|
|
||||||
const removeNotebook = useCallback((notebookId: string) => {
|
const removeNotebook = useCallback((notebookId: string) => {
|
||||||
return fetch(`/v1/notebooks/${notebookId}`, {
|
return deleteNotebook(notebookId, instance)
|
||||||
method: "DELETE",
|
|
||||||
}, isCloudEnvironment())
|
|
||||||
.then(() => {
|
.then(() => {
|
||||||
setNotebooks((notebooks) =>
|
setNotebooks((notebooks) =>
|
||||||
notebooks.filter((notebook) => notebook.id !== notebookId)
|
notebooks.filter((notebook) => notebook.id !== notebookId)
|
||||||
);
|
);
|
||||||
});
|
});
|
||||||
}, []);
|
}, [instance]);
|
||||||
|
|
||||||
const fetchNotebooks = useCallback(() => {
|
const fetchNotebooks = useCallback(() => {
|
||||||
return fetch("/v1/notebooks", {
|
return getNotebooks(instance)
|
||||||
headers: {
|
|
||||||
"Content-Type": "application/json",
|
|
||||||
},
|
|
||||||
}, isCloudEnvironment())
|
|
||||||
.then((response) => response.json())
|
|
||||||
.then((notebooks) => {
|
.then((notebooks) => {
|
||||||
setNotebooks(notebooks);
|
setNotebooks(notebooks);
|
||||||
|
|
||||||
return notebooks;
|
return notebooks;
|
||||||
})
|
})
|
||||||
.catch((error) => {
|
.catch((error) => {
|
||||||
console.error("Error fetching notebooks:", error);
|
console.error("Error fetching notebooks:", error.detail);
|
||||||
throw error
|
throw error
|
||||||
});
|
});
|
||||||
}, []);
|
}, [instance]);
|
||||||
|
|
||||||
const updateNotebook = useCallback((updatedNotebook: Notebook) => {
|
const updateNotebook = useCallback((updatedNotebook: Notebook) => {
|
||||||
setNotebooks((existingNotebooks) =>
|
setNotebooks((existingNotebooks) =>
|
||||||
|
|
@@ -64,20 +55,13 @@ function useNotebooks() {
   }, []);

   const saveNotebook = useCallback((notebook: Notebook) => {
-    return fetch(`/v1/notebooks/${notebook.id}`, {
-      body: JSON.stringify({
-        name: notebook.name,
-        cells: notebook.cells,
-      }),
-      method: "PUT",
-      headers: {
-        "Content-Type": "application/json",
-      },
-    }, isCloudEnvironment())
-      .then((response) => response.json())
-  }, []);
+    return persistNotebook(notebook.id, {
+      name: notebook.name,
+      cells: notebook.cells,
+    }, instance);
+  }, [instance]);

-  const runCell = useCallback((notebook: Notebook, cell: Cell, cogneeInstance: string) => {
+  const runCell = useCallback((notebook: Notebook, cell: Cell) => {
     setNotebooks((existingNotebooks) =>
       existingNotebooks.map((existingNotebook) =>
         existingNotebook.id === notebook.id ? {
@@ -89,20 +73,11 @@ function useNotebooks() {
             error: undefined,
           } : existingCell
         ),
-      } : notebook
+      } : existingNotebook
       )
     );

-    return fetch(`/v1/notebooks/${notebook.id}/${cell.id}/run`, {
-      body: JSON.stringify({
-        content: cell.content,
-      }),
-      method: "POST",
-      headers: {
-        "Content-Type": "application/json",
-      },
-    }, cogneeInstance === "cloud")
-      .then((response) => response.json())
+    return runNotebookCell(notebook.id, cell, instance)
       .then((response) => {
         setNotebooks((existingNotebooks) =>
           existingNotebooks.map((existingNotebook) =>
@@ -115,11 +90,11 @@ function useNotebooks() {
             error: response.error,
           } : existingCell
         ),
-      } : notebook
+      } : existingNotebook
       )
     );
       });
-  }, []);
+  }, [instance]);

   return {
     notebooks,
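The `} : notebook` → `} : existingNotebook` fixes above repair a shadowing bug: inside the `map`, returning the outer `notebook` instead of `existingNotebook` silently replaced every non-matching notebook. A standalone sketch of the nested immutable-update pattern `runCell` relies on (types simplified, names hypothetical):

```typescript
// Map over notebooks; for the matching notebook, map over its cells and
// replace only the cell whose run just finished. Every non-matching entry
// must be returned unchanged — that was the shadowed-variable bug.
interface CellLike { id: string; error?: string }
interface NotebookLike { id: string; cells: CellLike[] }

function applyCellError(
  notebooks: NotebookLike[],
  notebookId: string,
  cellId: string,
  error?: string,
): NotebookLike[] {
  return notebooks.map((existingNotebook) =>
    existingNotebook.id === notebookId
      ? {
          ...existingNotebook,
          cells: existingNotebook.cells.map((existingCell) =>
            existingCell.id === cellId ? { ...existingCell, error } : existingCell
          ),
        }
      : existingNotebook
  );
}
```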
@@ -7,4 +7,4 @@ export default function GitHubIcon({ width = 24, height = 24, color = 'currentCo
       </g>
     </svg>
   );
 }

@@ -46,7 +46,7 @@ export default function Header({ user }: HeaderProps) {
     checkMCPConnection();
     const interval = setInterval(checkMCPConnection, 30000);

     return () => clearInterval(interval);
   }, [setMCPConnected, setMCPDisconnected]);

@@ -90,7 +90,7 @@ export default function SearchView() {
     scrollToBottom();

     setSearchInputValue("");

     // Pass topK to sendMessage
     sendMessage(chatInput, searchType, topK)
       .then(scrollToBottom)

@@ -171,4 +171,4 @@ export default function SearchView() {
       </form>
     </div>
   );
 }

@@ -1,3 +1,2 @@
 export { default as Modal } from "./Modal";
 export { default as useModal } from "./useModal";

cognee-frontend/src/ui/elements/Notebook/MarkdownPreview.tsx (new file, 76 lines)
@@ -0,0 +1,76 @@
+import { memo } from "react";
+import ReactMarkdown from "react-markdown";
+
+interface MarkdownPreviewProps {
+  content: string;
+  className?: string;
+}
+
+function MarkdownPreview({ content, className = "" }: MarkdownPreviewProps) {
+  return (
+    <div className={`min-h-24 max-h-96 overflow-y-auto p-4 prose prose-sm max-w-none ${className}`}>
+      <ReactMarkdown
+        components={{
+          h1: ({ children }) => <h1 className="text-2xl font-bold mt-4 mb-2">{children}</h1>,
+          h2: ({ children }) => <h2 className="text-xl font-bold mt-3 mb-2">{children}</h2>,
+          h3: ({ children }) => <h3 className="text-lg font-bold mt-3 mb-2">{children}</h3>,
+          h4: ({ children }) => <h4 className="text-base font-bold mt-2 mb-1">{children}</h4>,
+          h5: ({ children }) => <h5 className="text-sm font-bold mt-2 mb-1">{children}</h5>,
+          h6: ({ children }) => <h6 className="text-xs font-bold mt-2 mb-1">{children}</h6>,
+          p: ({ children }) => <p className="mb-2">{children}</p>,
+          ul: ({ children }) => <ul className="list-disc list-inside mb-2 ml-4">{children}</ul>,
+          ol: ({ children }) => <ol className="list-decimal list-inside mb-2 ml-4">{children}</ol>,
+          li: ({ children }) => <li className="mb-1">{children}</li>,
+          blockquote: ({ children }) => (
+            <blockquote className="border-l-4 border-gray-300 pl-4 italic my-2">{children}</blockquote>
+          ),
+          code: ({ className, children, ...props }) => {
+            const isInline = !className;
+            return isInline ? (
+              <code className="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono" {...props}>
+                {children}
+              </code>
+            ) : (
+              <code className="block bg-gray-100 p-2 rounded text-sm font-mono overflow-x-auto" {...props}>
+                {children}
+              </code>
+            );
+          },
+          pre: ({ children }) => (
+            <pre className="bg-gray-100 p-2 rounded text-sm font-mono overflow-x-auto mb-2">
+              {children}
+            </pre>
+          ),
+          a: ({ href, children }) => (
+            <a href={href} className="text-blue-600 hover:underline" target="_blank" rel="noopener noreferrer">
+              {children}
+            </a>
+          ),
+          strong: ({ children }) => <strong className="font-bold">{children}</strong>,
+          em: ({ children }) => <em className="italic">{children}</em>,
+          hr: () => <hr className="my-4 border-gray-300" />,
+          table: ({ children }) => (
+            <div className="overflow-x-auto my-2">
+              <table className="min-w-full border border-gray-300">{children}</table>
+            </div>
+          ),
+          thead: ({ children }) => <thead className="bg-gray-100">{children}</thead>,
+          tbody: ({ children }) => <tbody>{children}</tbody>,
+          tr: ({ children }) => <tr className="border-b border-gray-300">{children}</tr>,
+          th: ({ children }) => (
+            <th className="border border-gray-300 px-4 py-2 text-left font-bold">
+              {children}
+            </th>
+          ),
+          td: ({ children }) => (
+            <td className="border border-gray-300 px-4 py-2">{children}</td>
+          ),
+        }}
+      >
+        {content}
+      </ReactMarkdown>
+    </div>
+  );
+}
+
+export default memo(MarkdownPreview);
@@ -2,15 +2,17 @@

 import { v4 as uuid4 } from "uuid";
 import classNames from "classnames";
-import { Fragment, MouseEvent, RefObject, useCallback, useEffect, useRef, useState } from "react";
+import { Fragment, MouseEvent, MutableRefObject, useCallback, useEffect, useRef, useState, memo } from "react";

 import { useModal } from "@/ui/elements/Modal";
 import { CaretIcon, CloseIcon, PlusIcon } from "@/ui/Icons";
-import { IconButton, PopupMenu, TextArea, Modal, GhostButton, CTAButton } from "@/ui/elements";
+import PopupMenu from "@/ui/elements/PopupMenu";
+import { IconButton, TextArea, Modal, GhostButton, CTAButton } from "@/ui/elements";
 import { GraphControlsAPI } from "@/app/(graph)/GraphControls";
 import GraphVisualization, { GraphVisualizationAPI } from "@/app/(graph)/GraphVisualization";

 import NotebookCellHeader from "./NotebookCellHeader";
+import MarkdownPreview from "./MarkdownPreview";
 import { Cell, Notebook as NotebookType } from "./types";

 interface NotebookProps {
@@ -19,7 +21,186 @@ interface NotebookProps {
   updateNotebook: (updatedNotebook: NotebookType) => void;
 }

+interface NotebookCellProps {
+  cell: Cell;
+  index: number;
+  isOpen: boolean;
+  isMarkdownEditMode: boolean;
+  onToggleOpen: () => void;
+  onToggleMarkdownEdit: () => void;
+  onContentChange: (value: string) => void;
+  onCellRun: (cell: Cell, cogneeInstance: string) => Promise<void>;
+  onCellRename: (cell: Cell) => void;
+  onCellRemove: (cell: Cell) => void;
+  onCellUp: (cell: Cell) => void;
+  onCellDown: (cell: Cell) => void;
+  onCellAdd: (afterCellIndex: number, cellType: "markdown" | "code") => void;
+}
+
+const NotebookCell = memo(function NotebookCell({
+  cell,
+  index,
+  isOpen,
+  isMarkdownEditMode,
+  onToggleOpen,
+  onToggleMarkdownEdit,
+  onContentChange,
+  onCellRun,
+  onCellRename,
+  onCellRemove,
+  onCellUp,
+  onCellDown,
+  onCellAdd,
+}: NotebookCellProps) {
+  return (
+    <Fragment>
+      <div className="flex flex-row rounded-xl border-1 border-gray-100">
+        <div className="flex flex-col flex-1 relative">
+          {cell.type === "code" ? (
+            <>
+              <div className="absolute left-[-1.35rem] top-2.5">
+                <IconButton className="p-[0.25rem] m-[-0.25rem]" onClick={onToggleOpen}>
+                  <CaretIcon className={classNames("transition-transform", isOpen ? "rotate-0" : "rotate-180")} />
+                </IconButton>
+              </div>
+
+              <NotebookCellHeader
+                cell={cell}
+                runCell={onCellRun}
+                renameCell={onCellRename}
+                removeCell={onCellRemove}
+                moveCellUp={onCellUp}
+                moveCellDown={onCellDown}
+                className="rounded-tl-xl rounded-tr-xl"
+              />
+
+              {isOpen && (
+                <>
+                  <TextArea
+                    value={cell.content}
+                    onChange={onContentChange}
+                    isAutoExpanding
+                    name="cellInput"
+                    placeholder="Type your code here..."
+                    className="resize-none min-h-36 max-h-96 overflow-y-auto rounded-tl-none rounded-tr-none rounded-bl-xl rounded-br-xl border-0 !outline-0"
+                  />
+
+                  <div className="flex flex-col bg-gray-100 overflow-x-auto max-w-full">
+                    {cell.result && (
+                      <div className="px-2 py-2">
+                        output: <CellResult content={cell.result} />
+                      </div>
+                    )}
+                    {!!cell.error?.length && (
+                      <div className="px-2 py-2">
+                        error: {cell.error}
+                      </div>
+                    )}
+                  </div>
+                </>
+              )}
+            </>
+          ) : (
+            <>
+              <div className="absolute left-[-1.35rem] top-2.5">
+                <IconButton className="p-[0.25rem] m-[-0.25rem]" onClick={onToggleOpen}>
+                  <CaretIcon className={classNames("transition-transform", isOpen ? "rotate-0" : "rotate-180")} />
+                </IconButton>
+              </div>
+
+              <NotebookCellHeader
+                cell={cell}
+                renameCell={onCellRename}
+                removeCell={onCellRemove}
+                moveCellUp={onCellUp}
+                moveCellDown={onCellDown}
+                className="rounded-tl-xl rounded-tr-xl"
+              />
+
+              {isOpen && (
+                <div className="relative rounded-tl-none rounded-tr-none rounded-bl-xl rounded-br-xl border-0 overflow-hidden">
+                  <GhostButton
+                    onClick={onToggleMarkdownEdit}
+                    className="absolute top-2 right-2.5 text-xs leading-[1] !px-2 !py-1 !h-auto"
+                  >
+                    {isMarkdownEditMode ? "Preview" : "Edit"}
+                  </GhostButton>
+                  {isMarkdownEditMode ? (
+                    <TextArea
+                      value={cell.content}
+                      onChange={onContentChange}
+                      isAutoExpanding
+                      name="markdownInput"
+                      placeholder="Type your markdown here..."
+                      className="resize-none min-h-24 max-h-96 overflow-y-auto rounded-tl-none rounded-tr-none rounded-bl-xl rounded-br-xl border-0 !outline-0 !bg-gray-50"
+                    />
+                  ) : (
+                    <MarkdownPreview content={cell.content} className="!bg-gray-50" />
+                  )}
+                </div>
+              )}
+            </>
+          )}
+        </div>
+      </div>
+      <div className="ml-[-1.35rem]">
+        <PopupMenu
+          openToRight={true}
+          triggerElement={<PlusIcon />}
+          triggerClassName="p-[0.25rem] m-[-0.25rem]"
+        >
+          <div className="flex flex-col gap-0.5">
+            <button
+              onClick={() => onCellAdd(index, "markdown")}
+              className="hover:bg-gray-100 w-full text-left px-2 cursor-pointer"
+            >
+              <span>text</span>
+            </button>
+          </div>
+          <div
+            onClick={() => onCellAdd(index, "code")}
+            className="hover:bg-gray-100 w-full text-left px-2 cursor-pointer"
+          >
+            <span>code</span>
+          </div>
+        </PopupMenu>
+      </div>
+    </Fragment>
+  );
+});
+
 export default function Notebook({ notebook, updateNotebook, runCell }: NotebookProps) {
+  const [openCells, setOpenCells] = useState(new Set(notebook.cells.map((c: Cell) => c.id)));
+  const [markdownEditMode, setMarkdownEditMode] = useState<Set<string>>(new Set());
+
+  const toggleCellOpen = useCallback((id: string) => {
+    setOpenCells((prev) => {
+      const newState = new Set(prev);
+
+      if (newState.has(id)) {
+        newState.delete(id)
+      } else {
+        newState.add(id);
+      }
+
+      return newState;
+    });
+  }, []);
+
+  const toggleMarkdownEditMode = useCallback((id: string) => {
+    setMarkdownEditMode((prev) => {
+      const newState = new Set(prev);
+
+      if (newState.has(id)) {
+        newState.delete(id);
+      } else {
+        newState.add(id);
+      }
+
+      return newState;
+    });
+  }, []);
+
   useEffect(() => {
     if (notebook.cells.length === 0) {
       const newCell: Cell = {
@@ -34,7 +215,7 @@ export default function Notebook({ notebook, updateNotebook, runCell }: Notebook
       });
       toggleCellOpen(newCell.id)
     }
-  }, [notebook, updateNotebook]);
+  }, [notebook, updateNotebook, toggleCellOpen]);

   const handleCellRun = useCallback((cell: Cell, cogneeInstance: string) => {
     return runCell(notebook, cell, cogneeInstance);
@@ -43,7 +224,7 @@ export default function Notebook({ notebook, updateNotebook, runCell }: Notebook
   const handleCellAdd = useCallback((afterCellIndex: number, cellType: "markdown" | "code") => {
     const newCell: Cell = {
       id: uuid4(),
-      name: "new cell",
+      name: cellType === "markdown" ? "Markdown Cell" : "Code Cell",
       type: cellType,
       content: "",
     };

@@ -59,7 +240,7 @@ export default function Notebook({ notebook, updateNotebook, runCell }: Notebook

     toggleCellOpen(newCell.id);
     updateNotebook(newNotebook);
-  }, [notebook, updateNotebook]);
+  }, [notebook, updateNotebook, toggleCellOpen]);

   const removeCell = useCallback((cell: Cell, event?: MouseEvent) => {
     event?.preventDefault();
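`handleCellAdd` places the new cell directly after the cell whose plus menu was clicked. The list surgery it performs can be sketched as a pure helper (a hypothetical standalone function, not code from the diff):

```typescript
// Insert `item` immediately after position `afterIndex` without mutating
// the original array — the shape React state updates require.
function insertAfter<T>(items: T[], afterIndex: number, item: T): T[] {
  return [...items.slice(0, afterIndex + 1), item, ...items.slice(afterIndex + 1)];
}
```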
@@ -81,14 +262,12 @@ export default function Notebook({ notebook, updateNotebook, runCell }: Notebook
     openCellRemoveConfirmModal(cell);
   }, [openCellRemoveConfirmModal]);

-  const handleCellInputChange = useCallback((notebook: NotebookType, cell: Cell, value: string) => {
-    const newCell = {...cell, content: value };
-
+  const handleCellInputChange = useCallback((cellId: string, value: string) => {
     updateNotebook({
       ...notebook,
-      cells: notebook.cells.map((cell: Cell) => (cell.id === newCell.id ? newCell : cell)),
+      cells: notebook.cells.map((cell: Cell) => (cell.id === cellId ? {...cell, content: value} : cell)),
     });
-  }, [updateNotebook]);
+  }, [notebook, updateNotebook]);

   const handleCellUp = useCallback((cell: Cell) => {
     const index = notebook.cells.indexOf(cell);
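The new `handleCellInputChange(cellId, value)` signature passes an id instead of a captured cell object, so edits always apply against the current cells array. Its pure core, extracted as a hypothetical helper for illustration:

```typescript
// Rebuild the cells array, replacing only the edited cell's content.
// Neither the array nor the untouched cells are mutated.
interface EditableCell { id: string; content: string }

function updateCellContent(cells: EditableCell[], cellId: string, value: string): EditableCell[] {
  return cells.map((cell) => (cell.id === cellId ? { ...cell, content: value } : cell));
}
```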
@@ -131,133 +310,28 @@ export default function Notebook({ notebook, updateNotebook, runCell }: Notebook
     }
   }, [notebook, updateNotebook]);

-  const [openCells, setOpenCells] = useState(new Set(notebook.cells.map((c: Cell) => c.id)));
-
-  const toggleCellOpen = (id: string) => {
-    setOpenCells((prev) => {
-      const newState = new Set(prev);
-
-      if (newState.has(id)) {
-        newState.delete(id)
-      } else {
-        newState.add(id);
-      }
-
-      return newState;
-    });
-  };
-
   return (
     <>
       <div className="bg-white rounded-xl flex flex-col gap-0.5 px-7 py-5 flex-1">
         <div className="mb-5">{notebook.name}</div>

         {notebook.cells.map((cell: Cell, index) => (
-          <Fragment key={cell.id}>
-            <div key={cell.id} className="flex flex-row rounded-xl border-1 border-gray-100">
-              <div className="flex flex-col flex-1 relative">
-                {cell.type === "code" ? (
-                  <>
-                    <div className="absolute left-[-1.35rem] top-2.5">
-                      <IconButton className="p-[0.25rem] m-[-0.25rem]" onClick={toggleCellOpen.bind(null, cell.id)}>
-                        <CaretIcon className={classNames("transition-transform", openCells.has(cell.id) ? "rotate-0" : "rotate-180")} />
-                      </IconButton>
-                    </div>
-
-                    <NotebookCellHeader
-                      cell={cell}
-                      runCell={handleCellRun}
-                      renameCell={handleCellRename}
-                      removeCell={handleCellRemove}
-                      moveCellUp={handleCellUp}
-                      moveCellDown={handleCellDown}
-                      className="rounded-tl-xl rounded-tr-xl"
-                    />
-
-                    {openCells.has(cell.id) && (
-                      <>
-                        <TextArea
-                          value={cell.content}
-                          onChange={handleCellInputChange.bind(null, notebook, cell)}
-                          // onKeyUp={handleCellRunOnEnter}
-                          isAutoExpanding
-                          name="cellInput"
-                          placeholder="Type your code here..."
-                          contentEditable={true}
-                          className="resize-none min-h-36 max-h-96 overflow-y-auto rounded-tl-none rounded-tr-none rounded-bl-xl rounded-br-xl border-0 !outline-0"
-                        />
-
-                        <div className="flex flex-col bg-gray-100 overflow-x-auto max-w-full">
-                          {cell.result && (
-                            <div className="px-2 py-2">
-                              output: <CellResult content={cell.result} />
-                            </div>
-                          )}
-                          {!!cell.error?.length && (
-                            <div className="px-2 py-2">
-                              error: {cell.error}
-                            </div>
-                          )}
-                        </div>
-                      </>
-                    )}
-                  </>
-                ) : (
-                  <>
-                    <div className="absolute left-[-1.35rem] top-2.5">
-                      <IconButton className="p-[0.25rem] m-[-0.25rem]" onClick={toggleCellOpen.bind(null, cell.id)}>
-                        <CaretIcon className={classNames("transition-transform", openCells.has(cell.id) ? "rotate-0" : "rotate-180")} />
-                      </IconButton>
-                    </div>
-
-                    <NotebookCellHeader
-                      cell={cell}
-                      renameCell={handleCellRename}
-                      removeCell={handleCellRemove}
-                      moveCellUp={handleCellUp}
-                      moveCellDown={handleCellDown}
-                      className="rounded-tl-xl rounded-tr-xl"
-                    />
-
-                    {openCells.has(cell.id) && (
-                      <TextArea
-                        value={cell.content}
-                        onChange={handleCellInputChange.bind(null, notebook, cell)}
-                        // onKeyUp={handleCellRunOnEnter}
-                        isAutoExpanding
-                        name="cellInput"
-                        placeholder="Type your text here..."
-                        contentEditable={true}
-                        className="resize-none min-h-24 max-h-96 overflow-y-auto rounded-tl-none rounded-tr-none rounded-bl-xl rounded-br-xl border-0 !outline-0"
-                      />
-                    )}
-                  </>
-                )}
-              </div>
-            </div>
-            <div className="ml-[-1.35rem]">
-              <PopupMenu
-                openToRight={true}
-                triggerElement={<PlusIcon />}
-                triggerClassName="p-[0.25rem] m-[-0.25rem]"
-              >
-                <div className="flex flex-col gap-0.5">
-                  <button
-                    onClick={() => handleCellAdd(index, "markdown")}
-                    className="hover:bg-gray-100 w-full text-left px-2 cursor-pointer"
-                  >
-                    <span>text</span>
-                  </button>
-                </div>
-                <div
-                  onClick={() => handleCellAdd(index, "code")}
-                  className="hover:bg-gray-100 w-full text-left px-2 cursor-pointer"
-                >
-                  <span>code</span>
-                </div>
-              </PopupMenu>
-            </div>
-          </Fragment>
+          <NotebookCell
+            key={cell.id}
+            cell={cell}
+            index={index}
+            isOpen={openCells.has(cell.id)}
+            isMarkdownEditMode={markdownEditMode.has(cell.id)}
+            onToggleOpen={() => toggleCellOpen(cell.id)}
+            onToggleMarkdownEdit={() => toggleMarkdownEditMode(cell.id)}
+            onContentChange={(value) => handleCellInputChange(cell.id, value)}
+            onCellRun={handleCellRun}
+            onCellRename={handleCellRename}
+            onCellRemove={handleCellRemove}
+            onCellUp={handleCellUp}
+            onCellDown={handleCellDown}
+            onCellAdd={handleCellAdd}
+          />
         ))}
       </div>
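Both `openCells` and `markdownEditMode` are kept as `Set`s of cell ids and toggled with the same copy-then-flip pattern (`toggleCellOpen`, `toggleMarkdownEditMode`). A standalone sketch of that pattern:

```typescript
// Copy the previous Set so React sees a new reference, then flip the
// membership of the given id. Mutating `prev` in place would be invisible
// to React's referential-equality check.
function toggleInSet(prev: Set<string>, id: string): Set<string> {
  const next = new Set(prev);
  if (next.has(id)) {
    next.delete(id);
  } else {
    next.add(id);
  }
  return next;
}
```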
@@ -288,6 +362,10 @@ function CellResult({ content }: { content: [] }) {
     getSelectedNode: () => null,
   });

+  if (content.length === 0) {
+    return <span>OK</span>;
+  }
+
   for (const line of content) {
     try {
       if (Array.isArray(line)) {

@@ -298,7 +376,7 @@ function CellResult({ content }: { content: [] }) {
           <span className="text-sm pl-2 mb-4">reasoning graph</span>
           <GraphVisualization
             data={transformInsightsGraphData(line)}
-            ref={graphRef as RefObject<GraphVisualizationAPI>}
+            ref={graphRef as MutableRefObject<GraphVisualizationAPI>}
             graphControls={graphControls}
             className="min-h-80"
           />

@@ -346,7 +424,7 @@ function CellResult({ content }: { content: [] }) {
           <span className="text-sm pl-2 mb-4">reasoning graph (datasets: {datasetName})</span>
           <GraphVisualization
             data={transformToVisualizationData(graph)}
-            ref={graphRef as RefObject<GraphVisualizationAPI>}
+            ref={graphRef as MutableRefObject<GraphVisualizationAPI>}
             graphControls={graphControls}
             className="min-h-80"
           />
@@ -356,8 +434,7 @@ function CellResult({ content }: { content: [] }) {
           }
         }
       }
-      if (typeof(line) === "object" && line["result"] && typeof(line["result"]) === "string") {
+      else if (typeof(line) === "object" && line["result"] && typeof(line["result"]) === "string") {
         const datasets = Array.from(
           // eslint-disable-next-line @typescript-eslint/no-explicit-any
           new Set(Object.values(line["datasets"]).map((dataset: any) => dataset.name))

@@ -369,39 +446,46 @@ function CellResult({ content }: { content: [] }) {
             <span className="block px-2 py-2 whitespace-normal">{line["result"]}</span>
           </div>
         );
-      }
-      if (typeof(line) === "object" && line["graphs"]) {
-        Object.entries<{ nodes: []; edges: []; }>(line["graphs"]).forEach(([datasetName, graph]) => {
-          parsedContent.push(
-            <div key={datasetName} className="w-full h-full bg-white">
-              <span className="text-sm pl-2 mb-4">reasoning graph (datasets: {datasetName})</span>
-              <GraphVisualization
-                data={transformToVisualizationData(graph)}
-                ref={graphRef as RefObject<GraphVisualizationAPI>}
-                graphControls={graphControls}
-                className="min-h-80"
-              />
-            </div>
-          );
-        });
-      }
-
-      if (typeof(line) === "object" && line["result"] && typeof(line["result"]) === "object") {
+        if (line["graphs"]) {
+          Object.entries<{ nodes: []; edges: []; }>(line["graphs"]).forEach(([datasetName, graph]) => {
+            parsedContent.push(
+              <div key={datasetName} className="w-full h-full bg-white">
+                <span className="text-sm pl-2 mb-4">reasoning graph (datasets: {datasetName})</span>
+                <GraphVisualization
+                  data={transformToVisualizationData(graph)}
+                  ref={graphRef as MutableRefObject<GraphVisualizationAPI>}
+                  graphControls={graphControls}
+                  className="min-h-80"
+                />
+              </div>
+            );
+          });
+        }
+      }
+      else if (typeof(line) === "object" && line["result"] && typeof(line["result"]) === "object") {
         parsedContent.push(
           <pre className="px-2 w-full h-full bg-white text-sm" key={String(line).slice(0, -10)}>
             {JSON.stringify(line["result"], null, 2)}
           </pre>
         )
       }
-      if (typeof(line) === "string") {
+      else if (typeof(line) === "object") {
+        parsedContent.push(
+          <pre className="px-2 w-full h-full bg-white text-sm" key={String(line).slice(0, -10)}>
+            {JSON.stringify(line, null, 2)}
+          </pre>
+        )
+      }
+      else if (typeof(line) === "string") {
         parsedContent.push(
           <pre className="px-2 w-full h-full bg-white text-sm whitespace-normal" key={String(line).slice(0, -10)}>
             {line}
           </pre>
         )
       }
-    } catch (error) {
-      console.error(error);
+    } catch {
+      // It is fine if we don't manage to parse the output line, we show it as it is.
       parsedContent.push(
         <pre className="px-2 w-full h-full bg-white text-sm whitespace-normal" key={String(line).slice(0, -10)}>
           {line}

@@ -415,7 +499,6 @@ function CellResult({ content }: { content: [] }) {
       {item}
     </div>
   ));

 };

 function transformToVisualizationData(graph: { nodes: [], edges: [] }) {
@@ -451,7 +534,7 @@ function transformInsightsGraphData(triplets: Triplet[]) {
       target: string,
       label: string,
     }
   } = {};

   for (const triplet of triplets) {
     nodes[triplet[0].id] = {

@@ -471,7 +554,7 @@ function transformInsightsGraphData(triplets: Triplet[]) {
       label: triplet[1]["relationship_name"],
     };
   }

   return {
     nodes: Object.values(nodes),
     links: Object.values(links),
@@ -1,9 +1,12 @@
+"use client";
 
 import { useState } from "react";
 import classNames from "classnames";
 
 import { isCloudEnvironment, useBoolean } from "@/utils";
 import { PlayIcon } from "@/ui/Icons";
-import { PopupMenu, IconButton } from "@/ui/elements";
+import PopupMenu from "@/ui/elements/PopupMenu";
+import { IconButton } from "@/ui/elements";
 import { LoadingIndicator } from "@/ui/App";
 
 import { Cell } from "./types";
@@ -39,7 +42,7 @@ export default function NotebookCellHeader({
     if (runCell) {
       setIsRunningCell();
       runCell(cell, runInstance)
-        .then(() => {
+        .finally(() => {
           setIsNotRunningCell();
         });
     }
@@ -53,7 +56,7 @@ export default function NotebookCellHeader({
           {isRunningCell ? <LoadingIndicator /> : <IconButton onClick={handleCellRun}><PlayIcon /></IconButton>}
         </>
       )}
-      <span className="ml-4">{cell.name}</span>
+      <span className="ml-4">{cell.type === "markdown" ? "Markdown Cell" : cell.name}</span>
     </div>
     <div className="pr-4 flex flex-row items-center gap-8">
       {runCell && (

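The `.then` → `.finally` change above keeps the running indicator from getting stuck when a cell run rejects. The same invariant, sketched in Python with illustrative names (not from the codebase): the flag must clear on both success and failure.

```python
def run_with_flag(execute, flags):
    """Set a 'running' flag, run the task, and clear the flag whether the
    task succeeds or raises -- the Python analogue of Promise.finally."""
    flags["running"] = True
    try:
        return execute()
    finally:
        # runs on both success and failure, like .finally(...)
        flags["running"] = False
```

With `.then`, a rejected promise would skip the cleanup callback; `try/finally` (and `.finally`) runs it unconditionally.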
@@ -1,12 +1,12 @@
 "use client";
 
 import classNames from "classnames";
-import { InputHTMLAttributes, useCallback, useEffect, useLayoutEffect, useRef } from "react"
+import { InputHTMLAttributes, useCallback, useEffect, useRef } from "react"
 
 interface TextAreaProps extends Omit<InputHTMLAttributes<HTMLTextAreaElement>, "onChange"> {
   isAutoExpanding?: boolean; // Set to true to enable auto-expanding text area behavior. Default is false.
-  value: string;
-  onChange: (value: string) => void;
+  value?: string;
+  onChange?: (value: string) => void;
 }
 
 export default function TextArea({
@@ -19,95 +19,81 @@ export default function TextArea({
   placeholder = "",
   onKeyUp,
   ...props
 }: TextAreaProps) {
-  const handleTextChange = useCallback((event: Event) => {
-    const fakeTextAreaElement = event.target as HTMLDivElement;
-    const newValue = fakeTextAreaElement.innerText;
-    onChange?.(newValue);
-  }, [onChange]);
-
-  const handleKeyUp = useCallback((event: Event) => {
-    if (onKeyUp) {
-      onKeyUp(event as unknown as React.KeyboardEvent<HTMLTextAreaElement>);
-    }
-  }, [onKeyUp]);
-
-  const handleTextAreaFocus = (event: React.FocusEvent<HTMLDivElement>) => {
-    if (event.target.innerText.trim() === placeholder) {
-      event.target.innerText = "";
-    }
-  };
-  const handleTextAreaBlur = (event: React.FocusEvent<HTMLDivElement>) => {
-    if (value === "") {
-      event.target.innerText = placeholder;
-    }
-  };
-
-  const handleChange = (event: React.ChangeEvent<HTMLTextAreaElement>) => {
-    onChange(event.target.value);
-  };
-
-  const fakeTextAreaRef = useRef<HTMLDivElement>(null);
-
-  useLayoutEffect(() => {
-    const fakeTextAreaElement = fakeTextAreaRef.current;
-
-    if (fakeTextAreaElement && fakeTextAreaElement.innerText.trim() !== "") {
-      fakeTextAreaElement.innerText = placeholder;
-    }
-  }, [placeholder]);
-
-  useLayoutEffect(() => {
-    const fakeTextAreaElement = fakeTextAreaRef.current;
-
-    if (fakeTextAreaElement) {
-      fakeTextAreaElement.addEventListener("input", handleTextChange);
-      fakeTextAreaElement.addEventListener("keyup", handleKeyUp);
-    }
-
-    return () => {
-      if (fakeTextAreaElement) {
-        fakeTextAreaElement.removeEventListener("input", handleTextChange);
-        fakeTextAreaElement.removeEventListener("keyup", handleKeyUp);
-      }
-    };
-  }, [handleKeyUp, handleTextChange]);
+  const textareaRef = useRef<HTMLTextAreaElement>(null);
+  const maxHeightRef = useRef<number | null>(null);
+  const throttleTimeoutRef = useRef<number | null>(null);
+  const lastAdjustTimeRef = useRef<number>(0);
+  const THROTTLE_MS = 250; // 4 calculations per second
+
+  const adjustHeight = useCallback(() => {
+    if (!isAutoExpanding || !textareaRef.current) return;
+
+    const textarea = textareaRef.current;
+
+    // Cache maxHeight on first calculation
+    if (maxHeightRef.current === null) {
+      const computedStyle = getComputedStyle(textarea);
+      maxHeightRef.current = computedStyle.maxHeight === "none"
+        ? Infinity
+        : parseInt(computedStyle.maxHeight) || Infinity;
+    }
+
+    // Reset height to auto to get the correct scrollHeight
+    textarea.style.height = "auto";
+    // Set height to scrollHeight, but respect max-height
+    const scrollHeight = textarea.scrollHeight;
+    textarea.style.height = `${Math.min(scrollHeight, maxHeightRef.current)}px`;
+    lastAdjustTimeRef.current = Date.now();
+  }, [isAutoExpanding]);
+
+  const handleChange = useCallback((event: React.ChangeEvent<HTMLTextAreaElement>) => {
+    const newValue = event.target.value;
+    onChange?.(newValue);
+
+    // Throttle height adjustments to avoid blocking typing
+    if (isAutoExpanding) {
+      const now = Date.now();
+      const timeSinceLastAdjust = now - lastAdjustTimeRef.current;
+
+      if (timeSinceLastAdjust >= THROTTLE_MS) {
+        adjustHeight();
+      } else {
+        if (throttleTimeoutRef.current !== null) {
+          clearTimeout(throttleTimeoutRef.current);
+        }
+        throttleTimeoutRef.current = window.setTimeout(() => {
+          adjustHeight();
+          throttleTimeoutRef.current = null;
+        }, THROTTLE_MS - timeSinceLastAdjust);
      }
    }
  }, [onChange, isAutoExpanding, adjustHeight]);
 
   useEffect(() => {
-    const fakeTextAreaElement = fakeTextAreaRef.current;
-    const textAreaText = fakeTextAreaElement?.innerText;
-
-    if (fakeTextAreaElement && (value === "" || value === "\n")) {
-      fakeTextAreaElement.innerText = placeholder;
-      return;
-    }
-
-    if (fakeTextAreaElement && textAreaText !== value) {
-      fakeTextAreaElement.innerText = value;
-    }
-  }, [placeholder, value]);
+    if (isAutoExpanding && textareaRef.current) {
+      adjustHeight();
+    }
+  }, [value, isAutoExpanding, adjustHeight]);
+
+  useEffect(() => {
+    return () => {
+      if (throttleTimeoutRef.current !== null) {
+        clearTimeout(throttleTimeoutRef.current);
+      }
+    };
+  }, []);
 
-  return isAutoExpanding ? (
-    <>
-      <div
-        ref={fakeTextAreaRef}
-        contentEditable="true"
-        role="textbox"
-        aria-multiline="true"
-        className={classNames("block w-full rounded-md bg-white px-4 py-4 text-base text-gray-900 outline-1 -outline-offset-1 outline-gray-300 placeholder:text-gray-400 focus:outline-2 focus:-outline-offset-2 focus:outline-indigo-600", className)}
-        onFocus={handleTextAreaFocus}
-        onBlur={handleTextAreaBlur}
-      />
-    </>
-  ) : (
+  return (
     <textarea
+      ref={isAutoExpanding ? textareaRef : undefined}
       name={name}
       style={style}
       value={value}
       placeholder={placeholder}
       className={classNames("block w-full rounded-md bg-white px-4 py-4 text-base text-gray-900 outline-1 -outline-offset-1 outline-gray-300 placeholder:text-gray-400 focus:outline-2 focus:-outline-offset-2 focus:outline-indigo-600", className)}
       onChange={handleChange}
+      onKeyUp={onKeyUp}
       {...props}
     />
   )

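The new `handleChange` throttles height recalculation: adjust immediately if `THROTTLE_MS` has elapsed, otherwise coalesce the burst into one trailing call. That leading-plus-trailing throttle shape can be sketched outside React; a rough Python sketch (the real component uses `window.setTimeout`; the injectable clock and `flush` stand-in here are purely for illustration):

```python
class Throttle:
    """Leading-edge throttle with a coalesced trailing call."""

    def __init__(self, fn, interval, clock):
        self.fn = fn
        self.interval = interval
        self.clock = clock  # injectable for testing
        self.last_run = float("-inf")
        self.pending = False

    def call(self):
        now = self.clock()
        if now - self.last_run >= self.interval:
            self.fn()            # run immediately (leading edge)
            self.last_run = now
            self.pending = False
        else:
            self.pending = True  # coalesce the burst into one trailing call

    def flush(self):
        # stand-in for the deferred setTimeout firing after the interval
        if self.pending:
            self.fn()
            self.last_run = self.clock()
            self.pending = False
```

The point of the trailing call is that the final keystroke in a burst still triggers exactly one height adjustment, so the textarea never ends up at a stale height.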
@@ -10,4 +10,4 @@ export { default as NeutralButton } from "./NeutralButton";
 export { default as StatusIndicator } from "./StatusIndicator";
 export { default as StatusDot } from "./StatusDot";
 export { default as Accordion } from "./Accordion";
 export { default as Notebook } from "./Notebook";

@@ -57,7 +57,7 @@ export default async function fetch(url: string, options: RequestInit = {}, useC
         new Error("Backend server is not responding. Please check if the server is running.")
       );
     }
 
     if (error.detail === undefined) {
       return Promise.reject(
         new Error("No connection to the server.")
@@ -74,7 +74,7 @@ export default async function fetch(url: string, options: RequestInit = {}, useC
 fetch.checkHealth = async () => {
   const maxRetries = 5;
   const retryDelay = 1000; // 1 second
 
   for (let i = 0; i < maxRetries; i++) {
     try {
       const response = await global.fetch(`${backendApiUrl.replace("/api", "")}/health`);
@@ -90,7 +90,7 @@ fetch.checkHealth = async () => {
       await new Promise(resolve => setTimeout(resolve, retryDelay));
     }
   }
 
   throw new Error("Backend server is not responding after multiple attempts");
 };

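`fetch.checkHealth` above polls `/health` up to `maxRetries` times with a fixed delay before giving up. The same bounded-retry shape, sketched in Python with the probe and sleep injected so the loop itself stays testable (names are illustrative, not from the codebase):

```python
import time

def check_health(probe, max_retries=5, retry_delay=1.0, sleep=time.sleep):
    """Return True as soon as probe() succeeds; sleep between failed
    attempts; raise after max_retries attempts, mirroring fetch.checkHealth."""
    for attempt in range(max_retries):
        if probe():
            return True
        if attempt < max_retries - 1:
            sleep(retry_delay)  # wait before the next attempt
    raise RuntimeError("Backend server is not responding after multiple attempts")
```

Note the `attempt < max_retries - 1` guard: there is no point sleeping after the final failed attempt.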
@@ -1,8 +1,12 @@
 import { redirect } from "next/navigation";
 
-export default function handleServerErrors(response: Response, retry?: (response: Response) => Promise<Response>, useCloud?: boolean): Promise<Response> {
+export default function handleServerErrors(
+  response: Response,
+  retry: ((response: Response) => Promise<Response>) | null = null,
+  useCloud: boolean = false,
+): Promise<Response> {
   return new Promise((resolve, reject) => {
-    if (response.status === 401 && !useCloud) {
+    if ((response.status === 401 || response.status === 403) && !useCloud) {
       if (retry) {
         return retry(response)
           .catch(() => {

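The widened `401 || 403` check means both unauthenticated and forbidden responses now take the retry-then-redirect path outside cloud mode. A condensed Python sketch of that branching (the return strings are illustrative placeholders, not the real API):

```python
def handle_auth_status(status, retry=None, use_cloud=False):
    """Mirror the branch above: 401/403 outside cloud mode tries the retry
    callback once, falling back to a login redirect if it fails too."""
    if status in (401, 403) and not use_cloud:
        if retry is not None:
            try:
                return retry()
            except Exception:
                return "redirect-to-login"
        return "redirect-to-login"
    return "pass-through"
```

In cloud mode the auth statuses pass through untouched, since the cloud caller handles its own authentication.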
@@ -105,14 +105,14 @@ If you'd rather run cognee-mcp in a container, you have two options:
 ```bash
 # For HTTP transport (recommended for web deployments)
 docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
 # For SSE transport
 docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
 # For stdio transport (default)
 docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main
 ```
 
 **Installing optional dependencies at runtime:**
 
 You can install optional dependencies when running the container by setting the `EXTRAS` environment variable:
 ```bash
 # Install a single optional dependency group at runtime
@@ -122,7 +122,7 @@ If you'd rather run cognee-mcp in a container, you have two options:
   --env-file ./.env \
   -p 8000:8000 \
   --rm -it cognee/cognee-mcp:main
 
 # Install multiple optional dependency groups at runtime (comma-separated)
 docker run \
   -e TRANSPORT_MODE=sse \
@@ -131,7 +131,7 @@ If you'd rather run cognee-mcp in a container, you have two options:
   -p 8000:8000 \
   --rm -it cognee/cognee-mcp:main
 ```
 
 **Available optional dependency groups:**
 - `aws` - S3 storage support
 - `postgres` / `postgres-binary` - PostgreSQL database support
@@ -160,7 +160,7 @@ If you'd rather run cognee-mcp in a container, you have two options:
 # With stdio transport (default)
 docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main
 ```
 
 **With runtime installation of optional dependencies:**
 ```bash
 # Install optional dependencies from Docker Hub image
@@ -357,7 +357,7 @@ You can configure both transports simultaneously for testing:
       "url": "http://localhost:8000/sse"
     },
     "cognee-http": {
       "type": "http",
       "url": "http://localhost:8000/mcp"
     }
   }

@@ -7,11 +7,11 @@ echo "Environment: $ENVIRONMENT"
 # Install optional dependencies if EXTRAS is set
 if [ -n "$EXTRAS" ]; then
   echo "Installing optional dependencies: $EXTRAS"
 
   # Get the cognee version that's currently installed
   COGNEE_VERSION=$(uv pip show cognee | grep "Version:" | awk '{print $2}')
   echo "Current cognee version: $COGNEE_VERSION"
 
   # Build the extras list for cognee
   IFS=',' read -ra EXTRA_ARRAY <<< "$EXTRAS"
   # Combine base extras from pyproject.toml with requested extras
@@ -28,11 +28,11 @@ if [ -n "$EXTRAS" ]; then
       fi
     fi
   done
 
   echo "Installing cognee with extras: $ALL_EXTRAS"
   echo "Running: uv pip install 'cognee[$ALL_EXTRAS]==$COGNEE_VERSION'"
   uv pip install "cognee[$ALL_EXTRAS]==$COGNEE_VERSION"
 
   # Verify installation
   echo ""
   echo "✓ Optional dependencies installation completed"
@@ -93,19 +93,19 @@ if [ -n "$API_URL" ]; then
   if echo "$API_URL" | grep -q "localhost" || echo "$API_URL" | grep -q "127.0.0.1"; then
     echo "⚠️  Warning: API_URL contains localhost/127.0.0.1"
     echo "   Original: $API_URL"
 
     # Try to use host.docker.internal (works on Mac/Windows and recent Linux with Docker Desktop)
     FIXED_API_URL=$(echo "$API_URL" | sed 's/localhost/host.docker.internal/g' | sed 's/127\.0\.0\.1/host.docker.internal/g')
 
     echo "   Converted to: $FIXED_API_URL"
     echo "   This will work on Mac/Windows/Docker Desktop."
     echo "   On Linux without Docker Desktop, you may need to:"
     echo "   - Use --network host, OR"
     echo "   - Set API_URL=http://172.17.0.1:8000 (Docker bridge IP)"
 
     API_URL="$FIXED_API_URL"
   fi
 
   API_ARGS="--api-url $API_URL"
   if [ -n "$API_TOKEN" ]; then
     API_ARGS="$API_ARGS --api-token $API_TOKEN"

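The entrypoint's `sed` pipeline rewrites loopback hosts so code inside the container can reach a server running on the Docker host. The same substitution expressed in Python, for reference (behaviorally equivalent to the two `sed` calls; the caveat about Linux without Docker Desktop still applies):

```python
def fix_api_url(api_url: str) -> str:
    """Replace localhost/127.0.0.1 with host.docker.internal, as the
    entrypoint script does before passing --api-url to the MCP server."""
    return (
        api_url
        .replace("localhost", "host.docker.internal")
        .replace("127.0.0.1", "host.docker.internal")
    )
```

URLs that already point at a non-loopback host pass through unchanged.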
@@ -16,4 +16,4 @@ EMBEDDING_API_VERSION=""
 
 
 GRAPHISTRY_USERNAME=""
 GRAPHISTRY_PASSWORD=""

@@ -14,7 +14,7 @@ This starter kit is deprecated. Its examples have been integrated into the `/new
 # Cognee Starter Kit
 Welcome to the <a href="https://github.com/topoteretes/cognee">cognee</a> Starter Repo! This repository is designed to help you get started quickly by providing a structured dataset and pre-built data pipelines using cognee to build powerful knowledge graphs.
 
 You can use this repo to ingest, process, and visualize data in minutes.
 
 By following this guide, you will:
 
@@ -80,7 +80,7 @@ Custom model uses custom pydantic model for graph extraction. This script catego
 python src/pipelines/custom-model.py
 ```
 
 ## Graph preview
 
 cognee provides a visualize_graph function that will render the graph for you.

@@ -71,7 +71,7 @@ def get_sync_router() -> APIRouter:
          -H "Content-Type: application/json" \\
          -H "Cookie: auth_token=your-token" \\
          -d '{"dataset_ids": ["123e4567-e89b-12d3-a456-426614174000", "456e7890-e12b-34c5-d678-901234567000"]}'
 
     # Sync all user datasets (empty request body or null dataset_ids)
     curl -X POST "http://localhost:8000/api/v1/sync" \\
          -H "Content-Type: application/json" \\
@@ -88,7 +88,7 @@ def get_sync_router() -> APIRouter:
     - **413 Payload Too Large**: Dataset too large for current cloud plan
     - **429 Too Many Requests**: Rate limit exceeded
 
     ## Notes
     - Sync operations run in the background - you get an immediate response
     - Use the returned run_id to track progress (status API coming soon)
     - Large datasets are automatically chunked for efficient transfer
@@ -179,7 +179,7 @@ def get_sync_router() -> APIRouter:
     ```
 
     ## Example Responses
 
     **No running syncs:**
     ```json
     {

@@ -21,7 +21,7 @@ binary streams, then stores them in a specified dataset for further processing.
 
     Supported Input Types:
     - **Text strings**: Direct text content
     - **File paths**: Local file paths (absolute paths starting with "/")
     - **File URLs**: "file:///absolute/path" or "file://relative/path"
     - **S3 paths**: "s3://bucket-name/path/to/file"
     - **Lists**: Multiple files or text strings in a single call

@@ -17,7 +17,7 @@ The `cognee config` command allows you to view and modify configuration settings
 
 You can:
 - View all current configuration settings
 - Get specific configuration values
 - Set configuration values
 - Unset (reset to default) specific configuration values
 - Reset all configuration to defaults

@@ -290,7 +290,7 @@ class NeptuneAnalyticsAdapter(NeptuneGraphDB, VectorDBInterface):
         query_string = f"""
             CALL neptune.algo.vectors.topKByEmbeddingWithFiltering({{
                 topK: {limit},
                 embedding: {embedding},
                 nodeFilter: {{ equals: {{property: '{self._COLLECTION_PREFIX}', value: '{collection_name}'}} }}
             }}
         )
@@ -299,7 +299,7 @@ class NeptuneAnalyticsAdapter(NeptuneGraphDB, VectorDBInterface):
 
         if with_vector:
             query_string += """
                 WITH node, score, id(node) as node_id
                 MATCH (n)
                 WHERE id(n) = id(node)
                 CALL neptune.algo.vectors.get(n)

@@ -14,6 +14,8 @@ from tenacity import (
 )
 import litellm
 import os
+from urllib.parse import urlparse
+import httpx
 from cognee.infrastructure.databases.vector.embeddings.EmbeddingEngine import EmbeddingEngine
 from cognee.infrastructure.databases.exceptions import EmbeddingException
 from cognee.infrastructure.llm.tokenizer.HuggingFace import (
@@ -79,10 +81,26 @@ class LiteLLMEmbeddingEngine(EmbeddingEngine):
         enable_mocking = str(enable_mocking).lower()
         self.mock = enable_mocking in ("true", "1", "yes")
 
+        # Validate provided custom embedding endpoint early to avoid long hangs later
+        if self.endpoint:
+            try:
+                parsed = urlparse(self.endpoint)
+            except Exception:
+                parsed = None
+            if not parsed or parsed.scheme not in ("http", "https") or not parsed.netloc:
+                logger.error(
+                    "Invalid EMBEDDING_ENDPOINT configured: '%s'. Expected a URL starting with http:// or https://",
+                    str(self.endpoint),
+                )
+                raise EmbeddingException(
+                    "Invalid EMBEDDING_ENDPOINT. Please set a valid URL (e.g., https://host:port) "
+                    "via environment variable EMBEDDING_ENDPOINT."
+                )
+
     @retry(
-        stop=stop_after_delay(128),
+        stop=stop_after_delay(30),
         wait=wait_exponential_jitter(2, 128),
-        retry=retry_if_not_exception_type(litellm.exceptions.NotFoundError),
+        retry=retry_if_not_exception_type((litellm.exceptions.NotFoundError, EmbeddingException)),
         before_sleep=before_sleep_log(logger, logging.DEBUG),
         reraise=True,
     )
@@ -111,12 +129,16 @@ class LiteLLMEmbeddingEngine(EmbeddingEngine):
             return [data["embedding"] for data in response["data"]]
         else:
             async with embedding_rate_limiter_context_manager():
-                response = await litellm.aembedding(
-                    model=self.model,
-                    input=text,
-                    api_key=self.api_key,
-                    api_base=self.endpoint,
-                    api_version=self.api_version,
+                # Ensure each attempt does not hang indefinitely
+                response = await asyncio.wait_for(
+                    litellm.aembedding(
+                        model=self.model,
+                        input=text,
+                        api_key=self.api_key,
+                        api_base=self.endpoint,
+                        api_version=self.api_version,
+                    ),
+                    timeout=30.0,
                 )
 
             return [data["embedding"] for data in response.data]
@@ -154,6 +176,27 @@ class LiteLLMEmbeddingEngine(EmbeddingEngine):
             logger.error("Context window exceeded for embedding text: %s", str(error))
             raise error
 
+        except asyncio.TimeoutError as e:
+            # Per-attempt timeout – likely an unreachable endpoint
+            logger.error(
+                "Embedding endpoint timed out. EMBEDDING_ENDPOINT='%s'. "
+                "Verify that the endpoint is reachable and correct.",
+                str(self.endpoint),
+            )
+            raise EmbeddingException(
+                "Embedding request timed out. Check EMBEDDING_ENDPOINT connectivity."
+            ) from e
+
+        except (httpx.ConnectError, httpx.ReadTimeout) as e:
+            logger.error(
+                "Failed to connect to embedding endpoint. EMBEDDING_ENDPOINT='%s'. "
+                "Ensure the URL is correct and the server is running.",
+                str(self.endpoint),
+            )
+            raise EmbeddingException(
+                "Cannot connect to embedding endpoint. Check EMBEDDING_ENDPOINT."
+            ) from e
+
         except (
             litellm.exceptions.BadRequestError,
             litellm.exceptions.NotFoundError,
@@ -162,8 +205,15 @@ class LiteLLMEmbeddingEngine(EmbeddingEngine):
             raise EmbeddingException(f"Failed to index data points using model {self.model}") from e
 
         except Exception as error:
-            logger.error("Error embedding text: %s", str(error))
-            raise error
+            # Fall back to a clear, actionable message for connectivity/misconfiguration issues
+            logger.error(
+                "Error embedding text: %s. EMBEDDING_ENDPOINT='%s'.",
+                str(error),
+                str(self.endpoint),
+            )
+            raise EmbeddingException(
+                "Embedding failed due to an unexpected error. Verify EMBEDDING_ENDPOINT and provider settings."
+            ) from error
 
     def get_vector_size(self) -> int:
         """

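Two of the changes above — early URL validation via `urlparse` and a per-attempt bound via `asyncio.wait_for` — can be exercised in isolation. A standalone sketch under those assumptions, using `ValueError` in place of the project's `EmbeddingException`:

```python
import asyncio
from urllib.parse import urlparse

def validate_endpoint(endpoint: str) -> None:
    """Reject endpoints that are not absolute http(s) URLs before any
    network call is made, mirroring the constructor check above."""
    parsed = urlparse(endpoint)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"Invalid EMBEDDING_ENDPOINT: {endpoint!r}")

async def call_with_timeout(make_request, timeout: float = 30.0):
    """Bound a single attempt so an unreachable endpoint fails fast
    instead of hanging, as asyncio.wait_for does in the diff."""
    return await asyncio.wait_for(make_request(), timeout=timeout)
```

Validating the URL in the constructor also lets the retry decorator skip `EmbeddingException`, so a misconfigured endpoint fails once instead of being retried for the full backoff window.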
@@ -10,4 +10,4 @@ Extraction rules:
 5. Current-time references ("now", "current", "today"): If the query explicitly refers to the present, set both starts_at and ends_at to now (the ingestion timestamp).
 6. "Who is" and "Who was" questions: These imply a general identity or biographical inquiry without a specific temporal scope. Set both starts_at and ends_at to None.
 7. Ordering rule: Always ensure the earlier date is assigned to starts_at and the later date to ends_at.
 8. No temporal information: If no valid or inferable time reference is found, set both starts_at and ends_at to None.

@@ -22,4 +22,4 @@ The `attributes` should be a list of dictionaries, each containing:
 - Relationships should be technical with one or at most two words. If two words, use underscore camelcase style
 - Relationships could imply general meaning like: subject, object, participant, recipient, agent, instrument, tool, source, cause, effect, purpose, manner, resource, etc.
 - You can combine two words to form a relationship name: subject_role, previous_owner, etc.
 - Focus on how the entity specifically relates to the event

@@ -27,4 +27,4 @@ class Event(BaseModel):
     time_from: Optional[Timestamp] = None
     time_to: Optional[Timestamp] = None
     location: Optional[str] = None
 ```
@@ -19,8 +19,8 @@ The aim is to achieve simplicity and clarity in the knowledge graph.
 - **Naming Convention**: Use snake_case for relationship names, e.g., `acted_in`.
 # 3. Coreference Resolution
 - **Maintain Entity Consistency**: When extracting entities, it's vital to ensure consistency.
-If an entity, such as "John Doe", is mentioned multiple times in the text but is referred to by different names or pronouns (e.g., "Joe", "he"),
-always use the most complete identifier for that entity throughout the knowledge graph. In this example, use "John Doe" as the Persons ID.
+If an entity is mentioned multiple times in the text but is referred to by different names or pronouns,
+always use the most complete identifier for that entity throughout the knowledge graph.
 Remember, the knowledge graph should be coherent and easily understandable, so maintaining consistency in entity references is crucial.
 # 4. Strict Compliance
 Adhere to the rules strictly. Non-compliance will result in termination
@@ -22,7 +22,7 @@ You are an advanced algorithm designed to extract structured information to buil
 3. **Coreference Resolution**:
    - Maintain one consistent node ID for each real-world entity.
    - Resolve aliases, acronyms, and pronouns to the most complete form.
-   - *Example*: Always use "John Doe" even if later referred to as "Doe" or "he".
+   - *Example*: Always use the full identifier even if the entity is later referred to in a similar but slightly different way.

 **Property & Data Guidelines**:
@@ -42,10 +42,10 @@ You are an advanced algorithm designed to extract structured information from un
 - **Rule**: Resolve all aliases, acronyms, and pronouns to one canonical identifier.

 > **One-Shot Example**:
-> **Input**: "John Doe is an author. Later, Doe published a book. He is well-known."
+> **Input**: "X is an author. Later, Doe published a book. He is well-known."
 > **Output Node**:
 > ```
-> John Doe (Person)
+> X (Person)
 > ```

 ---
@@ -15,7 +15,7 @@ You are an advanced algorithm that extracts structured data into a knowledge gra
 - Properties are key-value pairs; do not use escaped quotes.

 3. **Coreference Resolution**
-   - Use a single, complete identifier for each entity (e.g., always "John Doe" not "Joe" or "he").
+   - Use a single, complete identifier for each entity.

 4. **Relationship Labels**:
    - Use descriptive, lowercase, snake_case names for edges.
@@ -26,7 +26,7 @@ Use **basic atomic types** for node labels. Always prefer general types over spe
 - Good: "Alan Turing", "Google Inc.", "World War II"
 - Bad: "Entity_001", "1234", "he", "they"
 - Never use numeric or autogenerated IDs.
-- Prioritize **most complete form** of entity names for consistency (e.g., always use "John Doe" instead of "John" or "he").
+- Prioritize **most complete form** of entity names for consistency.

 2. Dates, Numbers, and Properties
 ---------------------------------
@@ -2,12 +2,12 @@ You are an expert query analyzer for a **GraphRAG system**. Your primary goal is

 Here are the available `SearchType` tools and their specific functions:

 - **`SUMMARIES`**: The `SUMMARIES` search type retrieves summarized information from the knowledge graph.

   **Best for:**

   - Getting concise overviews of topics
   - Summarizing large amounts of information
   - Quick understanding of complex subjects

   **Best for:**
@@ -16,7 +16,7 @@ Here are the available `SearchType` tools and their specific functions:
 - Understanding relationships between concepts
 - Exploring the structure of your knowledge graph

 * **`CHUNKS`**: The `CHUNKS` search type retrieves specific facts and information chunks from the knowledge graph.

   **Best for:**
@@ -122,4 +122,4 @@ Response: `NATURAL_LANGUAGE`



 Your response MUST be a single word, consisting of only the chosen `SearchType` name. Do not provide any explanation.
@@ -1 +1 @@
 Respond with: test
@@ -34,6 +34,7 @@ class LLMProvider(Enum):
     GEMINI = "gemini"
     MISTRAL = "mistral"
     BEDROCK = "bedrock"
+    LLAMA_CPP = "llama_cpp"


 def get_llm_client(raise_api_key_error: bool = True):
@@ -187,5 +188,28 @@ def get_llm_client(raise_api_key_error: bool = True):
             instructor_mode=llm_config.llm_instructor_mode.lower(),
         )

+    elif provider == LLMProvider.LLAMA_CPP:
+        from cognee.infrastructure.llm.structured_output_framework.litellm_instructor.llm.llama_cpp.adapter import (
+            LlamaCppAPIAdapter,
+        )
+
+        # Get optional local mode parameters (will be None if not set)
+        # TODO: refactor llm_config to include these parameters, currently they cannot be defined and defaults are used
+        model_path = getattr(llm_config, "llama_cpp_model_path", None)
+        n_ctx = getattr(llm_config, "llama_cpp_n_ctx", 2048)
+        n_gpu_layers = getattr(llm_config, "llama_cpp_n_gpu_layers", 0)
+        chat_format = getattr(llm_config, "llama_cpp_chat_format", "chatml")
+
+        return LlamaCppAPIAdapter(
+            model=llm_config.llm_model,
+            max_completion_tokens=max_completion_tokens,
+            instructor_mode=llm_config.llm_instructor_mode.lower(),
+            endpoint=llm_config.llm_endpoint,
+            api_key=llm_config.llm_api_key,
+            model_path=model_path,
+            n_ctx=n_ctx,
+            n_gpu_layers=n_gpu_layers,
+            chat_format=chat_format,
+        )
     else:
         raise UnsupportedLLMProviderError(provider)
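The dispatch above reads optional llama.cpp settings with `getattr(..., default)` so that an `llm_config` object lacking those attributes still works (per the TODO, the config cannot define them yet). A standalone sketch of that fallback pattern, using a hypothetical config class:

```python
class LLMConfig:
    """Hypothetical config object missing the optional llama.cpp fields."""

    llm_model = "llama-3-8b-instruct"  # illustrative model name


llm_config = LLMConfig()

# Each getattr falls back to a default when the attribute is absent,
# mirroring how get_llm_client tolerates an incomplete llm_config.
model_path = getattr(llm_config, "llama_cpp_model_path", None)
n_ctx = getattr(llm_config, "llama_cpp_n_ctx", 2048)
n_gpu_layers = getattr(llm_config, "llama_cpp_n_gpu_layers", 0)
chat_format = getattr(llm_config, "llama_cpp_chat_format", "chatml")
```

Once the config class grows these fields, the same `getattr` calls pick up the configured values with no further changes at the call site.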
@@ -0,0 +1,191 @@
+"""Adapter for Instructor-backed Structured Output Framework for Llama CPP"""
+
+import litellm
+import logging
+import instructor
+from typing import Type, Optional
+from openai import AsyncOpenAI
+from pydantic import BaseModel
+
+from cognee.infrastructure.llm.structured_output_framework.litellm_instructor.llm.llm_interface import (
+    LLMInterface,
+)
+from cognee.shared.logging_utils import get_logger
+from cognee.shared.rate_limiting import llm_rate_limiter_context_manager
+
+from tenacity import (
+    retry,
+    stop_after_delay,
+    wait_exponential_jitter,
+    retry_if_not_exception_type,
+    before_sleep_log,
+)
+
+logger = get_logger()
+
+
+class LlamaCppAPIAdapter(LLMInterface):
+    """
+    Adapter for Llama CPP LLM provider with support for TWO modes:
+
+    1. SERVER MODE (OpenAI-compatible):
+       - Connects to llama-cpp-python server via HTTP (local or remote)
+       - Uses instructor.from_openai()
+       - Requires: endpoint, api_key, model
+
+    2. LOCAL MODE (In-process):
+       - Loads model directly using llama-cpp-python library
+       - Uses instructor.patch() on llama.Llama object
+       - Requires: model_path
+
+    Public methods:
+    - acreate_structured_output
+
+    Instance variables:
+    - name
+    - model (for server mode) or model_path (for local mode)
+    - mode_type: "server" or "local"
+    - max_completion_tokens
+    - aclient
+    """
+
+    name: str
+    model: Optional[str]
+    model_path: Optional[str]
+    mode_type: str  # "server" or "local"
+    default_instructor_mode = instructor.Mode.JSON
+
+    def __init__(
+        self,
+        name: str = "LlamaCpp",
+        max_completion_tokens: int = 2048,
+        instructor_mode: Optional[str] = None,
+        # Server mode parameters
+        endpoint: Optional[str] = None,
+        api_key: Optional[str] = None,
+        model: Optional[str] = None,
+        # Local mode parameters
+        model_path: Optional[str] = None,
+        n_ctx: int = 2048,
+        n_gpu_layers: int = 0,
+        chat_format: str = "chatml",
+    ):
+        self.name = name
+        self.max_completion_tokens = max_completion_tokens
+        self.instructor_mode = instructor_mode if instructor_mode else self.default_instructor_mode
+
+        # Determine which mode to use
+        if model_path:
+            self._init_local_mode(model_path, n_ctx, n_gpu_layers, chat_format)
+        elif endpoint:
+            self._init_server_mode(endpoint, api_key, model)
+        else:
+            raise ValueError(
+                "Must provide either 'model_path' (for local mode) or 'endpoint' (for server mode)"
+            )
+
+    def _init_local_mode(self, model_path: str, n_ctx: int, n_gpu_layers: int, chat_format: str):
+        """Initialize local mode using llama-cpp-python library directly"""
+        try:
+            import llama_cpp
+        except ImportError:
+            raise ImportError(
+                "llama-cpp-python is not installed. Install with: pip install llama-cpp-python"
+            )
+
+        logger.info(f"Initializing LlamaCpp in LOCAL mode with model: {model_path}")
+
+        self.mode_type = "local"
+        self.model_path = model_path
+        self.model = None
+
+        # Initialize llama-cpp-python with the model
+        self.llama = llama_cpp.Llama(
+            model_path=model_path,
+            n_gpu_layers=n_gpu_layers,  # -1 for all GPU, 0 for CPU only
+            chat_format=chat_format,
+            n_ctx=n_ctx,
+            verbose=False,
+        )
+
+        self.aclient = instructor.patch(
+            create=self.llama.create_chat_completion_openai_v1,
+            mode=instructor.Mode(self.instructor_mode),
+        )
+
+    def _init_server_mode(self, endpoint: str, api_key: Optional[str], model: Optional[str]):
+        """Initialize server mode connecting to llama-cpp-python server"""
+        logger.info(f"Initializing LlamaCpp in SERVER mode with endpoint: {endpoint}")
+
+        self.mode_type = "server"
+        self.model = model
+        self.model_path = None
+        self.endpoint = endpoint
+        self.api_key = api_key
+
+        # Use instructor.from_openai() for server mode (OpenAI-compatible API)
+        self.aclient = instructor.from_openai(
+            AsyncOpenAI(base_url=self.endpoint, api_key=self.api_key),
+            mode=instructor.Mode(self.instructor_mode),
+        )
+
+    @retry(
+        stop=stop_after_delay(128),
+        wait=wait_exponential_jitter(8, 128),
+        retry=retry_if_not_exception_type(litellm.exceptions.NotFoundError),
+        before_sleep=before_sleep_log(logger, logging.DEBUG),
+        reraise=True,
+    )
+    async def acreate_structured_output(
+        self, text_input: str, system_prompt: str, response_model: Type[BaseModel], **kwargs
+    ) -> BaseModel:
+        """
+        Generate a structured output from the LLM using the provided text and system prompt.
+
+        Works in both local and server modes transparently.
+
+        Parameters:
+        -----------
+        - text_input (str): The input text provided by the user.
+        - system_prompt (str): The system prompt that guides the response generation.
+        - response_model (Type[BaseModel]): The model type that the response should conform to.
+
+        Returns:
+        --------
+        - BaseModel: A structured output that conforms to the specified response model.
+        """
+        async with llm_rate_limiter_context_manager():
+            # Prepare messages (system first, then user is more standard)
+            messages = [
+                {"role": "system", "content": system_prompt},
+                {"role": "user", "content": text_input},
+            ]
+
+            if self.mode_type == "server":
+                # Server mode: use async client with OpenAI-compatible API
+                response = await self.aclient.chat.completions.create(
+                    model=self.model,
+                    messages=messages,
+                    response_model=response_model,
+                    max_retries=2,
+                    max_completion_tokens=self.max_completion_tokens,
+                    **kwargs,
+                )
+            else:
+                import asyncio
+
+                # Local mode: instructor.patch() returns a SYNC callable
+                # Per docs: https://python.useinstructor.com/integrations/llama-cpp-python/
+                def _call_sync():
+                    return self.aclient(
+                        messages=messages,
+                        response_model=response_model,
+                        max_tokens=self.max_completion_tokens,
+                        **kwargs,
+                    )
+
+                # Run sync function in thread pool to avoid blocking
+                response = await asyncio.to_thread(_call_sync)
+
+            return response
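The adapter's constructor chooses local versus server mode purely from which arguments are present. A dependency-free sketch of that selection logic (the class and names here are illustrative restatements, not the cognee API):

```python
from typing import Optional


class ModeSelector:
    """Illustrative re-statement of LlamaCppAPIAdapter's mode selection."""

    def __init__(self, endpoint: Optional[str] = None, model_path: Optional[str] = None):
        if model_path:
            # A model file on disk means in-process loading via llama-cpp-python.
            self.mode_type = "local"
        elif endpoint:
            # An HTTP endpoint means an OpenAI-compatible llama.cpp server.
            self.mode_type = "server"
        else:
            raise ValueError(
                "Must provide either 'model_path' (for local mode) or 'endpoint' (for server mode)"
            )
```

Note that `model_path` wins when both arguments are supplied, matching the `if/elif` order in the adapter.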
@@ -3,3 +3,4 @@ from .get_notebooks import get_notebooks
 from .create_notebook import create_notebook
 from .update_notebook import update_notebook
 from .delete_notebook import delete_notebook
+from .create_tutorial_notebooks import create_tutorial_notebooks
@@ -6,40 +6,6 @@ from cognee.infrastructure.databases.relational import with_async_session

 from ..models.Notebook import Notebook, NotebookCell

-TUTORIAL_NOTEBOOK_NAME = "Python Development with Cognee Tutorial 🧠"
-
-
-async def _create_tutorial_notebook(
-    user_id: UUID, session: AsyncSession, force_refresh: bool = False
-) -> None:
-    """
-    Create the default tutorial notebook for new users.
-    Dynamically fetches from: https://github.com/topoteretes/cognee/blob/notebook_tutorial/notebooks/starter_tutorial.zip
-    """
-    TUTORIAL_ZIP_URL = (
-        "https://github.com/topoteretes/cognee/raw/notebook_tutorial/notebooks/starter_tutorial.zip"
-    )
-
-    try:
-        # Create notebook from remote zip file (includes notebook + data files)
-        notebook = await Notebook.from_ipynb_zip_url(
-            zip_url=TUTORIAL_ZIP_URL,
-            owner_id=user_id,
-            notebook_filename="tutorial.ipynb",
-            name=TUTORIAL_NOTEBOOK_NAME,
-            deletable=False,
-            force=force_refresh,
-        )
-
-        # Add to session and commit
-        session.add(notebook)
-        await session.commit()
-
-    except Exception as e:
-        print(f"Failed to fetch tutorial notebook from {TUTORIAL_ZIP_URL}: {e}")
-
-        raise e
-
-
 @with_async_session
 async def create_notebook(
191 cognee/modules/notebooks/methods/create_tutorial_notebooks.py (Normal file)
@@ -0,0 +1,191 @@
+from pathlib import Path
+from uuid import NAMESPACE_OID, UUID, uuid5, uuid4
+from typing import List, Optional, Dict, Any
+import re
+import json
+from sqlalchemy.ext.asyncio import AsyncSession
+
+from cognee.shared.logging_utils import get_logger
+from cognee.root_dir import ROOT_DIR
+
+from ..models.Notebook import Notebook, NotebookCell
+
+logger = get_logger()
+
+
+def _get_tutorials_directory() -> Path:
+    """Get the path to the tutorials directory."""
+    return ROOT_DIR / "modules" / "notebooks" / "tutorials"
+
+
+def _parse_cell_index(filename: str) -> int:
+    """Extract cell index from filename like 'cell-0.md' or 'cell-123.py'."""
+    match = re.search(r"cell-(\d+)", filename)
+    if match:
+        return int(match.group(1))
+    return -1
+
+
+def _get_cell_type(file_path: Path) -> str:
+    """Determine cell type from file extension."""
+    extension = file_path.suffix.lower()
+    if extension == ".md":
+        return "markdown"
+    elif extension == ".py":
+        return "code"
+    else:
+        raise ValueError(f"Unsupported cell file type: {extension}")
+
+
+def _extract_markdown_heading(content: str) -> str | None:
+    """Extract the first markdown heading from content."""
+    for line in content.splitlines():
+        line = line.strip()
+        # Match lines starting with one or more # followed by space and text
+        match = re.match(r"^#+\s+(.+)$", line)
+        if match:
+            return match.group(1).strip()
+    return None
+
+
+def _get_cell_name(cell_file: Path, cell_type: str, content: str) -> str:
+    """Get the appropriate name for a cell."""
+    if cell_type == "code":
+        return "Code Cell"
+    elif cell_type == "markdown":
+        heading = _extract_markdown_heading(content)
+        if heading:
+            return heading
+    # Fallback to filename stem
+    return cell_file.stem
+
+
+def _load_tutorial_cells(tutorial_dir: Path) -> List[NotebookCell]:
+    """Load all cells from a tutorial directory, sorted by cell index."""
+    cells = []
+
+    cell_files = [
+        file_path
+        for file_path in tutorial_dir.iterdir()
+        if file_path.is_file()
+        and file_path.name.startswith("cell-")
+        and file_path.suffix in [".md", ".py"]
+    ]
+
+    cell_files.sort(key=lambda file_path: _parse_cell_index(file_path.name))
+
+    for cell_file in cell_files:
+        try:
+            cell_type = _get_cell_type(cell_file)
+            content = cell_file.read_text(encoding="utf-8")
+            cell_name = _get_cell_name(cell_file, cell_type, content)
+
+            cells.append(
+                NotebookCell(
+                    id=uuid4(),
+                    type=cell_type,
+                    name=cell_name,
+                    content=content,
+                )
+            )
+        except Exception as e:
+            logger.warning(f"Failed to load cell {cell_file}: {e}")
+            continue
+
+    return cells
+
+
+def _read_tutorial_config(tutorial_dir: Path) -> Optional[Dict[str, Any]]:
+    """Read config.json from a tutorial directory if it exists."""
+    config_path = tutorial_dir / "config.json"
+    if config_path.exists():
+        try:
+            with open(config_path, "r", encoding="utf-8") as f:
+                return json.load(f)
+        except (json.JSONDecodeError, IOError) as e:
+            logger.warning(f"Failed to read config.json from {tutorial_dir}: {e}")
+            return None
+    return None
+
+
+def _format_tutorial_name(tutorial_dir_name: str) -> str:
+    """Format tutorial directory name into a readable notebook name (fallback)."""
+    name = tutorial_dir_name.replace("-", " ").replace("_", " ")
+    return f"{name.capitalize()} - tutorial 🧠"
+
+
+async def create_tutorial_notebooks(user_id: UUID, session: AsyncSession) -> None:
+    """
+    Create tutorial notebooks for all tutorials found in the tutorials directory.
+    Each tutorial directory will become a separate notebook.
+    """
+    try:
+        tutorials_dir = _get_tutorials_directory()
+
+        if not tutorials_dir.exists():
+            logger.warning(f"Tutorials directory not found: {tutorials_dir}")
+            return
+
+        tutorial_dirs = [
+            d for d in tutorials_dir.iterdir() if d.is_dir() and not d.name.startswith(".")
+        ]
+
+        if not tutorial_dirs:
+            logger.warning(f"No tutorial directories found in {tutorials_dir}")
+            return
+
+        notebooks_to_add = []
+
+        for tutorial_dir in tutorial_dirs:
+            try:
+                cells = _load_tutorial_cells(tutorial_dir)
+
+                if not cells:
+                    logger.warning(f"No cells found in tutorial directory: {tutorial_dir}")
+                    continue
+
+                config = _read_tutorial_config(tutorial_dir)
+
+                # Use name from config.json, or fallback to formatted directory name
+                if config and "name" in config:
+                    notebook_name = config["name"]
+                else:
+                    notebook_name = _format_tutorial_name(tutorial_dir.name)
+                    logger.warning(
+                        f"No config.json or 'name' field found in {tutorial_dir}, "
+                        f"using fallback name: {notebook_name}"
+                    )
+
+                # Use deletable flag from config.json, or default to False for tutorials
+                deletable = False
+                if config and "deletable" in config:
+                    deletable = bool(config["deletable"])
+
+                notebook_id = uuid5(NAMESPACE_OID, name=notebook_name)
+
+                notebook = Notebook(
+                    id=notebook_id,
+                    owner_id=user_id,
+                    name=notebook_name,
+                    cells=cells,
+                    deletable=deletable,
+                )
+
+                notebooks_to_add.append(notebook)
+                logger.info(f"Created tutorial notebook: {notebook_name} with {len(cells)} cells")
+
+            except Exception as e:
+                logger.error(f"Failed to create tutorial notebook from {tutorial_dir}: {e}")
+                continue
+
+        if not notebooks_to_add:
+            return
+
+        for notebook in notebooks_to_add:
+            session.add(notebook)
+
+        await session.commit()
+
+    except Exception as e:
+        logger.error(f"Failed to create tutorial notebooks for user {user_id}: {e}")
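The helpers in the new file derive a cell's order and type from its filename. A self-contained sketch of those parsing rules (the regex matches the one in `_parse_cell_index`; the surrounding function names are illustrative):

```python
import re
from pathlib import Path


def parse_cell_index(filename: str) -> int:
    # 'cell-7.py' -> 7; names without an index sort first with -1.
    match = re.search(r"cell-(\d+)", filename)
    return int(match.group(1)) if match else -1


def cell_type(path: Path) -> str:
    # .md becomes a markdown cell, .py a code cell.
    return {".md": "markdown", ".py": "code"}[path.suffix.lower()]


files = [Path("cell-10.py"), Path("cell-2.md"), Path("notes.txt"), Path("cell-1.md")]
cells = sorted(
    (f for f in files if f.name.startswith("cell-") and f.suffix in (".md", ".py")),
    key=lambda f: parse_cell_index(f.name),
)
```

Sorting by the parsed integer rather than the raw filename is what keeps `cell-10` after `cell-2` instead of between `cell-1` and `cell-2`.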
@@ -1,4 +1,4 @@
-from uuid import UUID
+from uuid import NAMESPACE_OID, UUID, uuid5
 from typing import List
 from sqlalchemy import select, and_
 from sqlalchemy.ext.asyncio import AsyncSession

@@ -6,7 +6,7 @@ from sqlalchemy.ext.asyncio import AsyncSession
 from cognee.infrastructure.databases.relational import with_async_session

 from ..models.Notebook import Notebook
-from .create_notebook import _create_tutorial_notebook, TUTORIAL_NOTEBOOK_NAME
+from .create_tutorial_notebooks import create_tutorial_notebooks

 from cognee.shared.logging_utils import get_logger

@@ -19,21 +19,25 @@ async def get_notebooks(
     session: AsyncSession,
 ) -> List[Notebook]:
     # Check if tutorial notebook already exists for this user
+    tutorial_notebook_ids = [
+        uuid5(NAMESPACE_OID, name="Cognee Basics - tutorial 🧠"),
+        uuid5(NAMESPACE_OID, name="Python Development with Cognee - tutorial 🧠"),
+    ]
     tutorial_query = select(Notebook).where(
         and_(
             Notebook.owner_id == user_id,
-            Notebook.name == TUTORIAL_NOTEBOOK_NAME,
+            Notebook.id.in_(tutorial_notebook_ids),
             ~Notebook.deletable,
         )
     )
     tutorial_result = await session.execute(tutorial_query)
-    tutorial_notebook = tutorial_result.scalar_one_or_none()
+    tutorial_notebooks = tutorial_result.scalars().all()

-    # If tutorial notebook doesn't exist, create it
-    if tutorial_notebook is None:
-        logger.info(f"Tutorial notebook not found for user {user_id}, creating it")
+    # If tutorial notebooks don't exist, create them
+    if len(tutorial_notebooks) == 0:
+        logger.info(f"Tutorial notebooks not found for user {user_id}, creating them")
         try:
-            await _create_tutorial_notebook(user_id, session, force_refresh=False)
+            await create_tutorial_notebooks(user_id, session)
         except Exception as e:
             # Log the error but continue to return existing notebooks
             logger.error(f"Failed to create tutorial notebook for user {user_id}: {e}")
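The lookup above works because `uuid5(NAMESPACE_OID, name=...)` is deterministic: the same notebook name always yields the same ID, so `create_tutorial_notebooks` and `get_notebooks` agree on IDs without sharing any state. A quick illustration:

```python
from uuid import NAMESPACE_OID, uuid5

# uuid5 is a pure function of namespace + name: recomputing it anywhere
# in the codebase produces the identical UUID for the same notebook name.
id_a = uuid5(NAMESPACE_OID, name="Cognee Basics - tutorial 🧠")
id_b = uuid5(NAMESPACE_OID, name="Cognee Basics - tutorial 🧠")
id_other = uuid5(NAMESPACE_OID, name="Python Development with Cognee - tutorial 🧠")
```

One caveat of this design: the hard-coded names in `get_notebooks` must stay byte-for-byte identical to the `name` values in each tutorial's `config.json`, or the IDs diverge and the notebooks are recreated on every fetch.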
@@ -0,0 +1,3 @@
+# Use Cognee to Build Your Own Knowledge Graph
+
+Cognee is a tool that allows you to build your own knowledge graph from the data you have.
10 cognee/modules/notebooks/tutorials/cognee-basics/cell-2.md (Normal file)
@@ -0,0 +1,10 @@
+# What You'll Learn in This Tutorial
+
+In this tutorial, you'll learn how to use Cognee to transform scattered data into an intelligent knowledge system that enhances your workflow.
+By the end, you'll have:
+
+- Connected disparate data sources into a unified AI memory graph
+- Built a memory layer that infers knowledge from provided data
+- Learned how to use search capabilities that combine the diverse context
+
+This tutorial demonstrates the power of knowledge graphs and retrieval-augmented generation (RAG), showing you how to build systems that learn from data and infer knowledge.
@@ -0,0 +1,7 @@
+# Cognee and Its Core Operations
+
+Before we dive in, let's understand the core Cognee operations we'll be working with:
+
+- `cognee.add()` - Ingests raw data into the system
+- `cognee.cognify()` - Processes and structures data into a knowledge graph using AI
+- `cognee.search()` - Queries the knowledge graph with natural language
28 cognee/modules/notebooks/tutorials/cognee-basics/cell-4.py (Normal file)
@@ -0,0 +1,28 @@
+# Add data one by one, or pass a list to add multiple items at once
+
+await cognee.add(
+    "Harry Potter is a student at Hogwarts and belongs to Gryffindor house. \
+    He is known for defeating Voldemort and his Patronus is a stag.",
+    dataset_name="cognee-basics",
+)
+
+await cognee.add(
+    "Hermione Granger is a student at Hogwarts and also belongs to Gryffindor house. \
+    She is known for her intelligence and deep knowledge of spells. Her Patronus is an otter.",
+    dataset_name="cognee-basics",
+)
+
+await cognee.add(
+    "Severus Snape is a professor at Hogwarts who teaches Potions. \
+    He belongs to Slytherin house and was secretly loyal to Albus Dumbledore.",
+    dataset_name="cognee-basics",
+)
+
+await cognee.add(
+    [
+        "Hogwarts is a magical school located in Scotland. During Harry Potter's time at school, the headmaster was Albus Dumbledore.",
+        "A Horcrux is a dark magic object used to store a fragment of a wizard's soul. Voldemort created multiple Horcruxes to achieve immortality.",
+        "The Elder Wand is a powerful wand believed to be unbeatable. Its final known owner was Harry Potter.",
+    ],
+    dataset_name="cognee-basics",
+)
@ -0,0 +1,3 @@
# Run cognify to process the data and create a knowledge graph

await cognee.cognify(datasets=["cognee-basics"])
@ -0,0 +1,9 @@
# And after the cognification, search the knowledge graph

result = await cognee.search(
    "Which characters belong to Gryffindor?",
    datasets=["cognee-basics"],
)

# Print the result so you can see it in the notebook output
print(result)
17
cognee/modules/notebooks/tutorials/cognee-basics/cell-7.py
Normal file

@ -0,0 +1,17 @@
|
||||||
|
# Run multiple searches and print the results
|
||||||
|
|
||||||
|
result_1 = await cognee.search(
|
||||||
|
"Who taught Potions at Hogwarts at time Albus Dumbledore was the headmaster?",
|
||||||
|
datasets=["cognee-basics"],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Print the result so you can see it in the notebook output
|
||||||
|
print(result_1)
|
||||||
|
|
||||||
|
|
||||||
|
result_2 = await cognee.search(
|
||||||
|
"How to defeat Voldemort?",
|
||||||
|
datasets=["cognee-basics"],
|
||||||
|
)
|
||||||
|
|
||||||
|
print(result_2)
|
||||||
|
@ -0,0 +1,4 @@
{
    "name": "Cognee Basics - tutorial 🧠",
    "deletable": false
}
@ -0,0 +1,3 @@
# Using Cognee with Python Development Data

Unite authoritative Python practice (Guido van Rossum's own contributions!), normative guidance (the Zen of Python/PEP 8), and your lived context (rules + conversations) into one AI memory that produces answers that are relevant, explainable, and consistent.
@ -0,0 +1,3 @@
|
||||||
|
You'll see that cognee has connected your Python development challenges with Guido's approaches, revealing patterns like:
|
||||||
|
- "Type hint implementation failed due to circular imports - similar to issue Guido solved in mypy PR #1234"
|
||||||
|
- "Performance bottleneck in list comprehension matches pattern Guido optimized in CPython commit abc123"
|
||||||