Switches Docker Compose services to local builds for the backend, frontend, and Langflow. Updates the embedding model component to support IBM watsonx.ai features, including input token truncation and original text output, adds new dependencies, and improves configuration options in the ingestion and agent flows.
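For reference, the truncation and original-text options correspond to the watsonx.ai embeddings API parameters `truncate_input_tokens` and `return_options.input_text`; how the Langflow component surfaces them is sketched below as an assumption, not the component's actual schema.

```yaml
# Hypothetical component configuration sketch (field names assumed).
watsonx_embeddings:
  model_id: ibm/slate-125m-english-rtrvr   # example model id
  parameters:
    truncate_input_tokens: 512             # truncate inputs beyond this limit instead of erroring
    return_options:
      input_text: true                     # echo the original text back with each embedding
```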
Introduced new environment variables for Anthropic, watsonx.ai, and Ollama API integration in both the backend and Langflow services within the docker-compose files, and enabled the backend service to build from a local Dockerfile instead of relying only on the published image.
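A minimal compose sketch of the shape of this change, assuming the services are named `openrag-backend` and `langflow` and the credentials are exposed as `ANTHROPIC_API_KEY`, `WATSONX_API_KEY`, `WATSONX_PROJECT_ID`, and `OLLAMA_BASE_URL`; the actual service and variable names in the repository may differ.

```yaml
services:
  openrag-backend:
    # Build from the local Dockerfile instead of pulling a published image.
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      # Provider credentials forwarded from the host environment (names assumed).
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - WATSONX_API_KEY=${WATSONX_API_KEY}
      - WATSONX_PROJECT_ID=${WATSONX_PROJECT_ID}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
  langflow:
    image: langflowai/langflow:latest  # image tag assumed
    environment:
      # The same provider variables are also passed to the Langflow service.
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - WATSONX_API_KEY=${WATSONX_API_KEY}
      - WATSONX_PROJECT_ID=${WATSONX_PROJECT_ID}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
```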
* Added flows with new components
* Commented out the model provider assignment
* Added agent component display name
* Commented out the provider assignment; assign the provider on the generic component and assign custom values
* Fixed Ollama not showing loading steps and loading steps never being removed
* Made the embedding and LLM models optional in the onboarding call
* Added isEmbedding handling in useModelSelection
* Added isEmbedding to the onboarding card, separating the embedding card from the non-embedding card
* Added one additional step to configure embeddings
* Added embedding provider config
* Changed settings.py to return early if not embedding
* Added editing fields to onboarding
* Updated onboarding and flows_service to change the embedding and LLM models separately
* Updated templates that need to be changed with provider values
* Updated flows with new components
* Changed the config manager to not have default models
* Changed flows_service settings
* Complete the onboarding steps when not configuring embeddings
* Add more onboarding steps
* Removed one step from the LLM steps
* Added Anthropic as a provider option for the language model on the frontend
* Added Anthropic models
* Added Anthropic support on the backend
* Fixed provider health and validation
* Format settings
* Changed the Anthropic logo
* Changed the button so it does not jump
* Changed the flows service to make Anthropic work
* Fixed some things
* Added embedding-specific global variables
* Updated flows
* Fixed the ingestion flow
* Implemented Anthropic on the settings page
* Added the embedding provider logo
* Updated the backend to work with the multi-provider config
* Updated useUpdateSettings with the new settings type
* Updated the provider health banner to check health with the new API
* Changed queries and mutations to use the new API
* Changed the embedding model input to work with the new API
* Implemented provider-based config on the frontend
* Updated the existing design
* Fixed settings configured
* Fixed the provider health query to include health checks for both providers
* Changed model-providers to show the configured providers correctly
* Updated prompt
* Updated the OpenRAG agent
* Fixed settings to allow editing providers and changing the LLM and embedding models
* Updated settings
* Changed the Langflow version
* Bumped the OpenRAG version
* Added more steps
* Updated settings to create the global variables
* Updated steps
* Updated the default prompt
---------
Co-authored-by: Sebastián Estévez <estevezsebastian@gmail.com>
Introduces the CONNECTOR_TYPE_URL environment variable to docker-compose files and assets, updates the OpenRAG URL ingestion flow to use it, and ensures it is set in the auth service global variables. This enables explicit configuration and handling of URL-based connectors in the OpenRAG system.
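In compose terms this follows the usual pass-through pattern, roughly as below; the placeholder `...` stands for the variables already in the list, and the exact placement is an assumption.

```yaml
services:
  langflow:
    image: langflowai/langflow:latest  # image tag assumed
    environment:
      - CONNECTOR_TYPE_URL=${CONNECTOR_TYPE_URL}
      # Appended to the list of variables Langflow imports as global variables.
      - LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT=...,CONNECTOR_TYPE_URL
```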
Added new environment variables (FILENAME, MIMETYPE, FILESIZE) and updated LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT to include them. Comment formatting and container_name lines were adjusted for consistency, and DEFAULT_FOLDER_NAME is now commented out in both files.
* Hard-coded OpenAI models
* Ensure the index exists when disable-ingest-with-Langflow is active
* Updated the backend to not update the embedding model when the flag is disabled
* Initialize the index on startup when the feature flag is enabled
* Put config.yaml into the Docker Compose setup (see the sketch below)
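For the config.yaml item, one plausible shape is a bind mount on the backend service; the service name and container path below are assumptions, and the actual mechanism may differ.

```yaml
services:
  openrag-backend:
    build: .
    volumes:
      # Assumed bind mount; the real target path may differ.
      - ./config.yaml:/app/config.yaml:ro
```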
Added OWNER, OWNER_NAME, OWNER_EMAIL, and CONNECTOR_TYPE environment variables to docker-compose.yml. Updated LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT to match. Changed header keys in langflow_file_service.py to uppercase and ensured values are stringified for consistency.
Switched the OpenRAG backend and frontend in docker-compose.yml to use local Dockerfile builds instead of remote images, and updated environment variables for clarity. In flows/openrag_agent.json and langflow_file_service, improved handling of docs_metadata to support Data objects and added logging for metadata ingestion. Added an agent_llm edge to the agent node in the flow definition.
Added OWNER, OWNER_NAME, OWNER_EMAIL, and CONNECTOR_TYPE environment variables to docker-compose.yml and updated LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT accordingly. Modified ingestion_flow.json to adjust node selection and className, and cleared a sensitive value. Added logging for metadata tweaks in langflow_file_service.py for better traceability.
Renames the environment variable FLOW_ID to LANGFLOW_CHAT_FLOW_ID in both docker-compose.yml and docker-compose-cpu.yml for consistency across configurations.
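In compose terms the rename amounts to the following, assuming the variable is set on the backend service and expanded from the host environment.

```yaml
services:
  openrag-backend:
    build: .
    environment:
      # Previously: - FLOW_ID=${FLOW_ID}
      - LANGFLOW_CHAT_FLOW_ID=${LANGFLOW_CHAT_FLOW_ID}
```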
Adds OPENSEARCH_PASSWORD to LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT in both docker-compose.yml and docker-compose-cpu.yml, and removes unnecessary whitespace for readability.