This commit simplifies the state update logic in the KnowledgeSourcesPage component by replacing multiple conditional assignments with concise if statements. It also removes the unused GoogleDriveFile and OneDriveFile interfaces.
♻️ (agent.py): refactor async_response, async_langflow, async_chat, async_langflow_chat, and async_langflow_chat_stream functions to return the full response object for function calls
🔧 (chat_service.py): update ChatService to include function call data in message_data if present
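For context, a minimal sketch of how the frontend might model the enriched message payload once ChatService includes function call data; the field names (`function_call`, `arguments`) are assumptions, not the actual message_data schema:

```typescript
// Hypothetical shape of a chat message as the frontend might receive it once
// ChatService forwards function call data; field names are illustrative only.
interface FunctionCallData {
  name: string;                       // name of the function the agent invoked
  arguments: Record<string, unknown>; // raw arguments returned with the response
}

interface ChatMessageData {
  id: string;
  role: "user" | "assistant";
  content: string;
  // Present only when the full response object included a function call.
  function_call?: FunctionCallData;
}
```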
This commit simplifies the state update logic in the KnowledgeSourcesPage component, applying each setting from the backend response through a short chain of conditional checks. It also removes the now-unnecessary settingsLoaded state.
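A minimal sketch of the guarded-update pattern these two KnowledgeSourcesPage commits converge on, assuming a hypothetical settings response shape (the actual field names may differ):

```typescript
import { useState } from "react";

// Sketch only: the settings fields below are assumptions for illustration,
// not the actual KnowledgeSourcesPage schema.
interface SettingsResponse {
  chunkSize?: number;
  chunkOverlap?: number;
  embeddingModel?: string;
}

function useSettingsFromBackend() {
  const [chunkSize, setChunkSize] = useState(1000);
  const [chunkOverlap, setChunkOverlap] = useState(200);
  const [embeddingModel, setEmbeddingModel] = useState("default");

  // Plain guards replace the earlier chain of conditional assignments, so only
  // fields present in the backend response overwrite local state.
  const applySettings = (data: SettingsResponse) => {
    if (data.chunkSize !== undefined) setChunkSize(data.chunkSize);
    if (data.chunkOverlap !== undefined) setChunkOverlap(data.chunkOverlap);
    if (data.embeddingModel) setEmbeddingModel(data.embeddingModel);
  };

  return { chunkSize, chunkOverlap, embeddingModel, applySettings };
}
```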
This commit refactors the KnowledgeSourcesPage component to add an ingestion settings section where users can configure document processing parameters such as chunk size and overlap. It also reworks the connector management interface to fetch connector statuses asynchronously and to handle fetch errors.
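A sketch of the async connector-status fetch with basic error handling; the endpoint path and the `ConnectorStatus` shape are assumptions, not the actual API:

```typescript
import { useEffect, useState } from "react";

// Hypothetical connector status record; fields are illustrative.
interface ConnectorStatus {
  id: string;
  name: string;
  connected: boolean;
}

function useConnectorStatuses() {
  const [statuses, setStatuses] = useState<ConnectorStatus[]>([]);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    let cancelled = false;
    (async () => {
      try {
        const res = await fetch("/api/connectors/status");
        if (!res.ok) throw new Error(`Status fetch failed: ${res.status}`);
        const data: ConnectorStatus[] = await res.json();
        if (!cancelled) setStatuses(data);
      } catch (err) {
        if (!cancelled) setError(err instanceof Error ? err.message : String(err));
      }
    })();
    return () => {
      cancelled = true; // avoid state updates after unmount
    };
  }, []);

  return { statuses, error };
}
```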
🚀 (frontend): Implement support for process.env.PORT to run the app on a configurable port
🔧 (frontend): Rename the port variable from lowercase 'port' to uppercase 'PORT' to follow the constant-naming convention
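A sketch of the configurable-port setup; the Express-style server and the fallback of 3000 are assumptions for illustration:

```typescript
import express from "express";

const app = express();

// Use PORT from the environment when provided, otherwise fall back to 3000.
const PORT = process.env.PORT ? parseInt(process.env.PORT, 10) : 3000;

app.listen(PORT, () => {
  console.log(`Frontend listening on port ${PORT}`);
});
```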
📝 (frontend): Add comments to clarify the purpose of loading conversation data only when user explicitly selects a conversation
📝 (frontend): Add comments to explain the logic for loading conversation data based on certain conditions
📝 (frontend): Add comments to describe the purpose of handling new conversation creation and resetting messages
📝 (frontend): Add comments to explain the logic for loading conversation data when conversationData changes
📝 (frontend): Add comments to clarify the purpose of loading conversations from the backend
📝 (frontend): Add comments to describe the logic for silent refresh to update data without loading states
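A sketch of the "silent refresh" idea the comment describes: re-fetch conversations without toggling the loading flag so the UI does not flash a spinner. The endpoint and `Conversation` shape are assumptions:

```typescript
import { useCallback, useState } from "react";

// Hypothetical conversation record; fields are illustrative.
interface Conversation {
  id: string;
  title: string;
}

function useConversations() {
  const [conversations, setConversations] = useState<Conversation[]>([]);
  const [loading, setLoading] = useState(false);

  const load = useCallback(async (silent = false) => {
    if (!silent) setLoading(true); // silent refresh skips the loading state
    try {
      const res = await fetch("/api/conversations");
      setConversations(await res.json());
    } finally {
      if (!silent) setLoading(false);
    }
  }, []);

  return { conversations, loading, load };
}
```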
📝 (frontend): Add comments to explain the purpose of starting a new conversation and creating a placeholder conversation
📝 (frontend): Add comments to clarify the logic for forking from a response and starting a new conversation
📝 (frontend): Add comments to describe the purpose of adding a conversation document and clearing conversation documents
📝 (frontend): Add comments to explain the logic for using a timeout to debounce multiple rapid refresh calls
📝 (frontend): Add comments to clarify the purpose of cleaning up timeout on unmount
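A sketch of the debounce-and-cleanup pattern these comments document: rapid refresh calls collapse into one timeout, which is cleared on unmount. The 300 ms delay and the `refresh` callback are assumptions:

```typescript
import { useEffect, useRef } from "react";

function useDebouncedRefresh(refresh: () => void, delayMs = 300) {
  const timeoutRef = useRef<ReturnType<typeof setTimeout> | null>(null);

  const requestRefresh = () => {
    // Collapse multiple rapid calls into a single pending refresh.
    if (timeoutRef.current) clearTimeout(timeoutRef.current);
    timeoutRef.current = setTimeout(refresh, delayMs);
  };

  useEffect(() => {
    // Clear any pending refresh when the component unmounts.
    return () => {
      if (timeoutRef.current) clearTimeout(timeoutRef.current);
    };
  }, []);

  return requestRefresh;
}
```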
📝 (frontend): Add comments to describe the logic for handling new conversation creation and resetting state
📝 (frontend): Add comments to explain the logic for forking from a response and starting a new conversation
📝 (frontend): Add comments to clarify the purpose of using useMemo for optimizing performance in ChatProvider
📝 (frontend): Add comments to describe the logic for using useMemo in the ChatProvider component
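A sketch of the useMemo optimization in a ChatProvider: the context value object is recreated only when its inputs change, so consumers do not re-render on every provider render. The value shape here is an assumption, not the actual provider API:

```tsx
import { createContext, useMemo, useState, type ReactNode } from "react";

// Hypothetical context value; fields are illustrative.
interface ChatContextValue {
  messages: string[];
  sendMessage: (text: string) => void;
}

export const ChatContext = createContext<ChatContextValue | null>(null);

export function ChatProvider({ children }: { children: ReactNode }) {
  const [messages, setMessages] = useState<string[]>([]);

  const value = useMemo<ChatContextValue>(
    () => ({
      messages,
      sendMessage: (text) => setMessages((prev) => [...prev, text]),
    }),
    [messages] // a new value object is created only when messages change
  );

  return <ChatContext.Provider value={value}>{children}</ChatContext.Provider>;
}
```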
📝 (frontend): Add comments to explain the purpose of the useChat custom hook
📝 (frontend): Add comments to clarify the error message when useChat is not used within a ChatProvider
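A sketch of the useChat guard these comments refer to, reusing the hypothetical ChatContext from the provider sketch above; the hook throws the documented error when used outside a ChatProvider:

```tsx
import { useContext } from "react";
import { ChatContext } from "./ChatProvider"; // path assumed for illustration

export function useChat() {
  const ctx = useContext(ChatContext);
  if (ctx === null) {
    // Surfaces misuse immediately instead of failing later with a null value.
    throw new Error("useChat must be used within a ChatProvider");
  }
  return ctx;
}
```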
📝 (services): Update ChatService to pass a flow_id parameter when fetching Langflow history, scoping results to a specific flow
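A sketch of the flow-scoped history fetch; the endpoint path, query parameter names, and message shape are assumptions, not the actual ChatService API:

```typescript
// Hypothetical history record; fields are illustrative.
interface HistoryMessage {
  id: string;
  sender: string;
  text: string;
}

export async function getLangflowHistory(
  sessionId: string,
  flowId: string
): Promise<HistoryMessage[]> {
  // flow_id narrows the returned history to a single flow.
  const params = new URLSearchParams({ session_id: sessionId, flow_id: flowId });
  const res = await fetch(`/api/chat/history?${params.toString()}`);
  if (!res.ok) {
    throw new Error(`History fetch failed: ${res.status}`);
  }
  return res.json();
}
```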
This commit adds a complete file handling flow to the KnowledgeDropdown component: files are uploaded to Langflow, an ingestion flow is run over them, and the uploaded files are then deleted. Each step has its own error handling, and events are dispatched to notify the UI of the upload and ingestion results.
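A sketch of that upload → ingest → clean-up sequence with per-step error handling and UI notification events; the endpoint paths, event names, and ingestion payload are assumptions, not the actual Langflow API:

```typescript
export async function uploadAndIngest(files: File[], ingestFlowId: string) {
  const uploadedIds: string[] = [];
  try {
    // 1. Upload each file to Langflow.
    for (const file of files) {
      const form = new FormData();
      form.append("file", file);
      const res = await fetch("/api/langflow/files", { method: "POST", body: form });
      if (!res.ok) throw new Error(`Upload failed for ${file.name}`);
      uploadedIds.push((await res.json()).id);
    }

    // 2. Run the ingestion flow over the uploaded files.
    const ingest = await fetch(`/api/langflow/flows/${ingestFlowId}/run`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ fileIds: uploadedIds }),
    });
    if (!ingest.ok) throw new Error("Ingestion flow failed");

    // 3. Notify the UI of the result.
    window.dispatchEvent(new CustomEvent("knowledge:ingested", { detail: { uploadedIds } }));
  } catch (err) {
    window.dispatchEvent(new CustomEvent("knowledge:error", { detail: { error: String(err) } }));
  } finally {
    // 4. Delete the uploaded files regardless of the ingestion outcome.
    await Promise.allSettled(
      uploadedIds.map((id) => fetch(`/api/langflow/files/${id}`, { method: "DELETE" }))
    );
  }
}
```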
This commit introduces state management for ingest flow IDs and their corresponding edit URLs in the KnowledgeSourcesPage component, and adds a new file ingestion section to the settings page where users can customize their file processing pipeline.
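A sketch of holding ingest flow IDs alongside derived edit URLs; the Langflow URL pattern shown is an assumption for illustration:

```typescript
import { useState } from "react";

function useIngestFlows(langflowBaseUrl: string) {
  const [ingestFlowIds, setIngestFlowIds] = useState<string[]>([]);

  // Each flow ID maps to a URL where the user can open and customize that pipeline.
  const editUrls = ingestFlowIds.map((id) => `${langflowBaseUrl}/flow/${id}`);

  return { ingestFlowIds, setIngestFlowIds, editUrls };
}
```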