* Initial plan
* Implement dynamic Ollama embedding dimension resolution with probing
Co-authored-by: phact <1313220+phact@users.noreply.github.com>
* Fix Ollama probing
* raise instead of returning dims 0
* Show better error
* Run embedding probe before saving settings so that the user can update them (probe sketched below)
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: phact <1313220+phact@users.noreply.github.com>
Co-authored-by: Lucas Oliveira <lucas.edu.oli@hotmail.com>
Co-authored-by: phact <estevezsebastian@gmail.com>
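The probing commits above boil down to embedding a tiny test string and reading back the vector length. A minimal sketch, assuming a plain Ollama endpoint; the function name and error text are illustrative, not the actual OpenRAG code:

```python
# Hypothetical probe of an Ollama embedding model's dimension (names are illustrative).
import httpx


async def probe_embedding_dimension(base_url: str, model: str) -> int:
    """Embed a short probe string and return the vector length reported by Ollama."""
    async with httpx.AsyncClient(timeout=30) as client:
        resp = await client.post(
            f"{base_url.rstrip('/')}/api/embeddings",
            json={"model": model, "prompt": "dimension probe"},
        )
        resp.raise_for_status()
        embedding = resp.json().get("embedding", [])
    if not embedding:
        # Raise instead of reporting dims 0 so settings are never saved with an unusable model.
        raise ValueError(
            f"Ollama model '{model}' returned no embedding; check the model name and endpoint"
        )
    return len(embedding)
```

Running the probe before persisting settings means a bad endpoint or model name surfaces as an error the user can fix, rather than being saved with a zero dimension.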
* hard-coded openai models
* ensure index exists when 'disable ingest with Langflow' is active
* update backend to not update embedding model when flag is disabled
* initialize index on startup when feature flag is enabled
* put config.yaml on docker compose
* changed tooltip style
* added start on label wrapper
* changed switch to checkbox on OpenAI onboarding and updated the copy
* made border red when API key is invalid
* Added embedding configuration after onboarding
* changed the OpenRAG ingest Docling flow to use the same embedding model component as other flows
* changed flows service to get flow by id, not by path
* modify reset_langflow to also set the right embedding model
* added endpoint and project id to provider config
* replace the model with the provider model when resetting
* Moved consts to settings.py
* raise when flow_id is not found (sketched below)
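The flow-service change in the two commits above (lookup by ID, raising when missing) might look roughly like this sketch; the class name, route, and header are assumptions based on Langflow's REST API, not the actual OpenRAG code:

```python
# Hypothetical flows-service lookup by ID; class name, route, and header are assumptions.
import httpx


class FlowNotFoundError(Exception):
    """Raised when Langflow has no flow with the requested ID."""


class FlowsService:
    def __init__(self, langflow_url: str, api_key: str):
        self.langflow_url = langflow_url.rstrip("/")
        self.api_key = api_key

    async def get_flow_by_id(self, flow_id: str) -> dict:
        """Fetch a flow by its ID instead of resolving it from a file path."""
        async with httpx.AsyncClient() as client:
            resp = await client.get(
                f"{self.langflow_url}/api/v1/flows/{flow_id}",
                headers={"x-api-key": self.api_key},
            )
        if resp.status_code == 404:
            # Fail loudly instead of returning None so callers can surface the problem.
            raise FlowNotFoundError(f"Flow '{flow_id}' not found in Langflow")
        resp.raise_for_status()
        return resp.json()
```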
Switched OpenRAG backend and frontend in docker-compose.yml to use local Dockerfile builds instead of remote images. Updated environment variables for better clarity and system integration. In flows/openrag_agent.json and langflow_file_service, improved handling of docs_metadata to support Data objects and added logging for metadata ingestion. Added agent_llm edge to agent node in flow definition.
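A hedged sketch of the docs_metadata handling mentioned above, assuming Langflow `Data` objects expose their payload via a `.data` attribute; the helper name is hypothetical:

```python
# Illustrative normalization of docs_metadata that accepts plain dicts or Langflow Data objects.
import logging

logger = logging.getLogger(__name__)


def normalize_docs_metadata(docs_metadata) -> dict:
    """Return a plain dict whether docs_metadata is a dict or a Langflow Data object."""
    if docs_metadata is None:
        return {}
    # Langflow Data objects carry their payload in a `.data` dict.
    if hasattr(docs_metadata, "data"):
        metadata = dict(docs_metadata.data)
    else:
        metadata = dict(docs_metadata)
    logger.info("Ingesting docs_metadata: %s", metadata)
    return metadata
```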
* implement delete user conversation on agent
* format
* implement delete session endpoint
* implement delete session on persistence services (see the sketch after this list)
* added deletion of sessions and switched session fetching to a query instead of useEffect
* removed unused texts
* implemented dropdown menu on conversations
* fixed logos
* pre-populate ollama endpoint
* show/hide api keys with button
* Added dropdown selector for IBM endpoint
* Added programmatic dot pattern instead of a background image
* Updated copies
* wait 500ms before showing the connecting state
* Changed copy
* removed unused log
* Added padding when password button is present
* made toggle trigger on mouse up
* show a 'loading models' placeholder while models are loading
* removed description from model selector
* implemented getting key from env
* fixed complete button not updating
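The delete-session commits earlier in this list roughly correspond to an endpoint plus a persistence-layer delete. A minimal sketch with a stand-in in-memory store and auth dependency; the route and names are assumptions, not the actual OpenRAG code:

```python
# Hypothetical delete-session endpoint; route, store, and auth dependency are stand-ins.
from fastapi import APIRouter, Depends, HTTPException

router = APIRouter()

# Stand-in for the real persistence backend.
_SESSIONS: dict[str, dict] = {}


class SessionPersistence:
    async def delete_session(self, session_id: str, user_id: str) -> bool:
        """Delete the session only if it exists and belongs to the user."""
        session = _SESSIONS.get(session_id)
        if session is None or session.get("user_id") != user_id:
            return False
        del _SESSIONS[session_id]
        return True


persistence = SessionPersistence()


async def get_current_user_id() -> str:
    # Placeholder for the real JWT-based auth dependency.
    return "demo-user"


@router.delete("/sessions/{session_id}")
async def delete_session(session_id: str, user_id: str = Depends(get_current_user_id)):
    """Delete a single conversation so it no longer appears in the user's session list."""
    deleted = await persistence.delete_session(session_id=session_id, user_id=user_id)
    if not deleted:
        raise HTTPException(status_code=404, detail="Session not found")
    return {"deleted": True, "session_id": session_id}
```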
Added LangflowMCPService to update MCP servers with the user's JWT header after authentication. AuthService now triggers a background update to MCP servers on successful login, ensuring JWT propagation for downstream services.
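Only the service name comes from the commit above; the update endpoint, payload, and background scheduling below are assumptions sketching the JWT-propagation idea:

```python
# Sketch of JWT propagation to MCP servers; endpoint and payload are hypothetical.
import asyncio
import logging

import httpx

logger = logging.getLogger(__name__)


class LangflowMCPService:
    def __init__(self, langflow_url: str):
        self.langflow_url = langflow_url.rstrip("/")

    async def update_mcp_servers_with_jwt(self, jwt_token: str) -> None:
        """Push the authenticated user's JWT so downstream MCP calls carry it."""
        async with httpx.AsyncClient() as client:
            resp = await client.patch(
                f"{self.langflow_url}/api/v1/mcp/servers",  # hypothetical endpoint
                json={"headers": {"Authorization": f"Bearer {jwt_token}"}},
            )
            resp.raise_for_status()
        logger.info("MCP servers updated with user JWT header")


def schedule_mcp_update(service: LangflowMCPService, jwt_token: str) -> None:
    """Fire-and-forget background update, mirroring what AuthService triggers on login."""
    asyncio.create_task(service.update_mcp_servers_with_jwt(jwt_token))
```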
- Added a new `DeleteConfirmationDialog` component for confirming deletions.
- Updated `KnowledgeDropdown` to include a loading state and improved user feedback during file operations.
- Enhanced the search page to support bulk deletion of documents with confirmation dialog.
- Integrated event dispatching for knowledge updates after file operations.
- Refactored various components for better readability and maintainability.
This commit updates the service initialization in main.py to replace the existing ConnectorService with LangflowConnectorService, moving connector-document management onto the new service in line with the ongoing improvements in asynchronous processing and code maintainability.
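The swap amounts to wiring a different implementation during startup. An illustrative sketch, with a stub standing in for the real class and a guessed constructor signature:

```python
# Illustrative service wiring for main.py; the constructor signature is an assumption.
class LangflowConnectorService:
    """Stand-in for the real service that manages connector documents via Langflow."""

    def __init__(self, langflow_url: str, api_key: str):
        self.langflow_url = langflow_url
        self.api_key = api_key


def init_services(settings) -> dict:
    # Previously: services["connector"] = ConnectorService(settings)
    return {
        "connector": LangflowConnectorService(
            langflow_url=settings.langflow_url,
            api_key=settings.langflow_api_key,
        )
    }
```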
This commit introduces a new combined endpoint for uploading files and running ingestion in Langflow. The frontend component is updated to utilize this endpoint, streamlining the process by eliminating separate upload and ingestion calls. The response structure is adjusted to include deletion status and other relevant information, enhancing error handling and logging practices throughout the codebase.
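A hedged sketch of the combined endpoint shape described above; the route, helper functions, and response fields are illustrative, with placeholders where the real code calls Langflow:

```python
# Hypothetical combined upload-and-ingest endpoint; helpers are placeholders for Langflow calls.
from fastapi import APIRouter, UploadFile

router = APIRouter()


async def _upload_to_langflow(file: UploadFile) -> str:
    """Placeholder: upload the file to Langflow and return its file ID."""
    return f"file-{file.filename}"


async def _run_ingestion_flow(file_id: str) -> dict:
    """Placeholder: trigger the ingestion flow for the uploaded file."""
    return {"status": "ingested", "file_id": file_id}


@router.post("/files/upload_and_ingest")
async def upload_and_ingest(file: UploadFile):
    """Upload a file and run ingestion in one call instead of two round-trips."""
    file_id = await _upload_to_langflow(file)
    ingestion = await _run_ingestion_flow(file_id)
    return {
        "file_id": file_id,
        "ingestion": ingestion,
        # Deletion status of the temporary upload, as mentioned in the commit body.
        "deleted": True,
    }
```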