diff --git a/docs/docs/get-started/install.mdx b/docs/docs/get-started/install.mdx new file mode 100644 index 00000000..67f4ae89 --- /dev/null +++ b/docs/docs/get-started/install.mdx @@ -0,0 +1,179 @@ +--- +title: Install OpenRAG +slug: /install +--- + +import Tabs from '@theme/Tabs'; +import TabItem from '@theme/TabItem'; + +OpenRAG can be installed in two ways: + +* [**Python wheel**](#install-python-wheel): Install the OpenRAG Python wheel and use the [OpenRAG Terminal User Interface (TUI)](/get-started/tui) to install, run, and configure your OpenRAG deployment without running Docker commands. + +* [**Docker Compose**](#install-and-run-docker): Clone the OpenRAG repository and deploy OpenRAG with Docker Compose, including all services and dependencies. + +## Prerequisites + +- [Python version 3.10 to 3.13](https://www.python.org/downloads/release/python-3100/) +- [uv](https://docs.astral.sh/uv/getting-started/installation/) installed +- [Docker](https://docs.docker.com/get-docker/) or [Podman](https://podman.io/docs/installation) installed +- [Docker Compose](https://docs.docker.com/compose/install/) installed. If using Podman, use [podman-compose](https://docs.podman.io/en/latest/markdown/podman-compose.1.html) or alias Docker Compose commands to Podman commands. +- For GPU support: (TBD) + +## Python wheel {#install-python-wheel} + +The Python wheel is currently available internally, but will be available on PyPI at launch. +Installing the wheel provides the OpenRAG package, which includes the TUI for installing, running, and managing OpenRAG. +For more information on virtual environments, see the [uv documentation](https://docs.astral.sh/uv/pip/environments). + +1. Create a new project with a virtual environment using [uv](https://docs.astral.sh/uv/pip/environments). + + ```bash + uv init YOUR_PROJECT_NAME + cd YOUR_PROJECT_NAME + ``` +2. Add the OpenRAG wheel to your project and install it in the virtual environment.
+ Replace `PATH/TO/` and `VERSION` with your OpenRAG wheel location and version. + ```bash + uv add PATH/TO/openrag-VERSION-py3-none-any.whl + ``` +3. Ensure all dependencies are installed and updated in your virtual environment. + ```bash + uv sync + ``` + +4. Start the OpenRAG TUI. + ```bash + uv run openrag + ``` + + The OpenRAG TUI opens. + +5. To install OpenRAG with Basic Setup, click **Basic Setup** or press 1. Basic Setup does not set up OAuth connections for ingestion from Google Drive, OneDrive, or AWS. For OAuth setup, see [Advanced Setup](#advanced-setup). + The TUI prompts you for the required startup values. + Click **Generate Passwords** to autocomplete fields that contain **Auto-generated Secure Password**, or bring your own passwords. +
+ Where do I find the required startup values? + + | Variable | Where to Find | Description | + |----------|---------------|-------------| + | `OPENSEARCH_PASSWORD` | Auto-generated secure password | The password for OpenSearch database access. Must be at least 8 characters and must contain at least one uppercase letter, one lowercase letter, one digit, and one special character. | + | `OPENAI_API_KEY` | [OpenAI Platform](https://platform.openai.com/api-keys) | API key from your OpenAI account. | + | `LANGFLOW_SUPERUSER` | User-defined | Username for Langflow admin access. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-superuser). | + | `LANGFLOW_SUPERUSER_PASSWORD` | Auto-generated secure password | Password for Langflow admin access. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-superuser). | + | `LANGFLOW_SECRET_KEY` | Auto-generated secure key | Secret key for Langflow security. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-secret-key). | + | `LANGFLOW_AUTO_LOGIN` | User-defined setting | Auto-login configuration. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-auto-login). | + | `LANGFLOW_NEW_USER_IS_ACTIVE` | User-defined setting | New user activation setting. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-new-user-is-active). | + | `LANGFLOW_ENABLE_SUPERUSER_CLI` | User-defined setting | Superuser CLI access setting. For more, see the [Langflow docs](https://docs.langflow.org/api-keys-and-authentication#langflow-enable-superuser-cli). | + | `DOCUMENTS_PATH` | Set your local path | Path to your document storage directory. | +
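The `OPENSEARCH_PASSWORD` complexity rules above (at least 8 characters, with an uppercase letter, a lowercase letter, a digit, and a special character) can be checked before saving the configuration. This is an illustrative sketch, not part of OpenRAG; the `check_password` helper is hypothetical:

```bash
# Validate a candidate password against the documented OPENSEARCH_PASSWORD
# rules: >= 8 chars, one uppercase, one lowercase, one digit, one special char.
check_password() {
  pw="$1"
  [ "${#pw}" -ge 8 ] || { echo "too short"; return 1; }
  case "$pw" in *[A-Z]*) ;; *) echo "missing uppercase"; return 1 ;; esac
  case "$pw" in *[a-z]*) ;; *) echo "missing lowercase"; return 1 ;; esac
  case "$pw" in *[0-9]*) ;; *) echo "missing digit"; return 1 ;; esac
  case "$pw" in *[!a-zA-Z0-9]*) ;; *) echo "missing special character"; return 1 ;; esac
  echo "ok"
}

check_password 'Op3nRAG!demo'   # prints: ok
```

The TUI's **Generate Passwords** option produces values that satisfy these rules automatically.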
+ + To save your credentials, click **Save Configuration**. + +6. To start OpenRAG with your credentials, click **Start Container Services**. + Startup pulls container images and starts them, so it can take some time. + The operation is complete when the **Close** button is available and the terminal displays: + ```bash + Services started successfully + Command completed successfully + ``` + +7. To open the OpenRAG application, click **Open App** or press 6. +8. Continue with the Quickstart. + +### Advanced Setup {#advanced-setup} + +**Advanced Setup** includes the required values from **Basic Setup**, plus additional settings for OAuth credentials. +If the OpenRAG TUI detects OAuth credentials, it enforces the Advanced Setup path. +1. Add your client and secret values for Google, Azure, or AWS OAuth. +You can find these values in your OAuth provider's settings. +2. The OpenRAG TUI presents redirect URIs for your OAuth app. +These are the URLs your OAuth provider redirects back to after user sign-in. +Register these redirect URIs with your OAuth provider exactly as they are presented in the TUI. +3. To open the OpenRAG application, click **Open App** or press 6. +Your provider's OAuth sign-in screen appears, and after sign-in you are redirected to the registered redirect URI. + +Two additional variables are available for Advanced Setup: + +`LANGFLOW_PUBLIC_URL` controls where the Langflow web interface can be accessed. This is where users interact with their flows in a browser. + +`WEBHOOK_BASE_URL` controls where the `/connectors/CONNECTOR_TYPE/webhook` endpoint is exposed. +This connection enables real-time document synchronization with external services. +For example, for Google Drive file synchronization the webhook URL is `/connectors/google_drive/webhook`. + +## Docker {#install-and-run-docker} + +There are two different Docker Compose files. +They deploy the same applications and containers, but target different environments.
+ +- [`docker-compose.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose.yml) is an OpenRAG deployment with GPU support for accelerated AI processing. + +- [`docker-compose-cpu.yml`](https://github.com/langflow-ai/openrag/blob/main/docker-compose-cpu.yml) is a CPU-only version of OpenRAG for systems without GPU support. Use this Docker Compose file for environments where GPU drivers aren't available. + +To install OpenRAG with Docker Compose: + +1. Clone the OpenRAG repository. + ```bash + git clone https://github.com/langflow-ai/openrag.git + cd openrag + ``` + +2. Copy the example `.env` file that is included in the repository root. + The example file includes all environment variables with comments to guide you in finding and setting their values. + ```bash + cp .env.example .env + ``` + + Alternatively, create a new `.env` file in the repository root. + ```bash + touch .env + ``` + +3. Set environment variables. The Docker Compose files are populated with values from your `.env`, so the following values **must** be set: + + ```bash + OPENSEARCH_PASSWORD=your_secure_password + OPENAI_API_KEY=your_openai_api_key + + LANGFLOW_SUPERUSER=admin + LANGFLOW_SUPERUSER_PASSWORD=your_langflow_password + LANGFLOW_SECRET_KEY=your_secret_key + ``` + For more information on configuring OpenRAG with environment variables, see [Environment variables](/configure/configuration). + For additional configuration values, including `config.yaml`, see [Configuration](/configure/configuration). + +4. Deploy OpenRAG with Docker Compose based on your deployment type.
+ + For GPU-enabled systems, run the following command: + ```bash + docker compose up -d + ``` + + For CPU-only systems, run the following command: + ```bash + docker compose -f docker-compose-cpu.yml up -d + ``` + + The OpenRAG Docker Compose file starts five containers: + | Container Name | Default Address | Purpose | + |---|---|---| + | OpenRAG Backend | http://localhost:8000 | FastAPI server and core functionality. | + | OpenRAG Frontend | http://localhost:3000 | React web interface for users. | + | Langflow | http://localhost:7860 | AI workflow engine and flow management. | + | OpenSearch | http://localhost:9200 | Vector database for document storage. | + | OpenSearch Dashboards | http://localhost:5601 | Database administration interface. | + +5. Verify installation by confirming all services are running. + + ```bash + docker compose ps + ``` + + You can now access the application at: + + - **Frontend**: http://localhost:3000 + - **Backend API**: http://localhost:8000 + - **Langflow**: http://localhost:7860 + +Continue with the Quickstart. 
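Beyond `docker compose ps`, you can probe each default address from the container table above. This is an illustrative sketch, assuming the default ports; a `000` status means the service is unreachable, and OpenSearch may additionally require authentication depending on your security settings:

```bash
# Print the HTTP status code for each OpenRAG service endpoint.
# curl's --write-out emits 000 when the connection fails.
check_services() {
  for url in "$@"; do
    code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 5 "$url")
    echo "$url -> $code"
  done
}

check_services \
  http://localhost:8000 \
  http://localhost:3000 \
  http://localhost:7860 \
  http://localhost:9200 \
  http://localhost:5601
```

A non-`000` status for every address is a quick signal that all five containers came up.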
\ No newline at end of file diff --git a/docs/sidebars.js b/docs/sidebars.js index a0e7103a..21cfe52f 100644 --- a/docs/sidebars.js +++ b/docs/sidebars.js @@ -23,14 +23,18 @@ const sidebars = { { type: "doc", id: "get-started/what-is-openrag", - label: "Introduction" + label: "About OpenRAG" + }, + { + type: "doc", + id: "get-started/install", + label: "Installation" }, { type: "doc", id: "get-started/quickstart", label: "Quickstart" }, - { type: "doc", id: "get-started/docker", diff --git a/frontend/components.json b/frontend/components.json index 8e7f1638..53d2101e 100644 --- a/frontend/components.json +++ b/frontend/components.json @@ -10,9 +10,13 @@ "cssVariables": true, "prefix": "" }, + "iconLibrary": "lucide", "aliases": { "components": "components", "utils": "lib/utils", "ui": "components/ui" + }, + "registries": { + "@magicui": "https://magicui.design/r/{name}.json" } -} \ No newline at end of file +} diff --git a/frontend/components/delete-session-modal.tsx b/frontend/components/delete-session-modal.tsx new file mode 100644 index 00000000..7b57a44f --- /dev/null +++ b/frontend/components/delete-session-modal.tsx @@ -0,0 +1,58 @@ +"use client"; + +import { AlertTriangle } from "lucide-react"; +import { Button } from "@/components/ui/button"; +import { + Dialog, + DialogContent, + DialogDescription, + DialogFooter, + DialogHeader, + DialogTitle, +} from "@/components/ui/dialog"; + +interface DeleteSessionModalProps { + isOpen: boolean; + onClose: () => void; + onConfirm: () => void; + sessionTitle: string; + isDeleting?: boolean; +} + +export function DeleteSessionModal({ + isOpen, + onClose, + onConfirm, + sessionTitle, + isDeleting = false, +}: DeleteSessionModalProps) { + return ( + + + + + + Delete Conversation + + + Are you sure you want to delete "{sessionTitle}"? This + action cannot be undone and will permanently remove the conversation + and all its messages. 
+ + + + + + + + + ); +} diff --git a/frontend/components/logo/ibm-logo.tsx b/frontend/components/logo/ibm-logo.tsx index 6f7fc2cd..44b6e08c 100644 --- a/frontend/components/logo/ibm-logo.tsx +++ b/frontend/components/logo/ibm-logo.tsx @@ -11,7 +11,7 @@ export default function IBMLogo(props: React.SVGProps) { IBM Logo ); diff --git a/frontend/components/logo/openai-logo.tsx b/frontend/components/logo/openai-logo.tsx index 639c130e..330211b9 100644 --- a/frontend/components/logo/openai-logo.tsx +++ b/frontend/components/logo/openai-logo.tsx @@ -23,7 +23,7 @@ export default function OpenAILogo(props: React.SVGProps) { diff --git a/frontend/components/navigation-layout.tsx b/frontend/components/navigation-layout.tsx index fae8da62..d7a564a7 100644 --- a/frontend/components/navigation-layout.tsx +++ b/frontend/components/navigation-layout.tsx @@ -1,8 +1,12 @@ -"use client" +"use client"; -import { Navigation } from "@/components/navigation"; -import { ModeToggle } from "@/components/mode-toggle"; +import { usePathname } from "next/navigation"; +import { useGetConversationsQuery } from "@/app/api/queries/useGetConversationsQuery"; import { KnowledgeFilterDropdown } from "@/components/knowledge-filter-dropdown"; +import { ModeToggle } from "@/components/mode-toggle"; +import { Navigation } from "@/components/navigation"; +import { useAuth } from "@/contexts/auth-context"; +import { useChat } from "@/contexts/chat-context"; import { useKnowledgeFilter } from "@/contexts/knowledge-filter-context"; interface NavigationLayoutProps { @@ -11,11 +15,35 @@ interface NavigationLayoutProps { export function NavigationLayout({ children }: NavigationLayoutProps) { const { selectedFilter, setSelectedFilter } = useKnowledgeFilter(); - + const pathname = usePathname(); + const { isAuthenticated, isNoAuthMode } = useAuth(); + const { + endpoint, + refreshTrigger, + refreshConversations, + startNewConversation, + } = useChat(); + + // Only fetch conversations on chat page + const 
isOnChatPage = pathname === "/" || pathname === "/chat"; + const { data: conversations = [], isLoading: isConversationsLoading } = + useGetConversationsQuery(endpoint, refreshTrigger, { + enabled: isOnChatPage && (isAuthenticated || isNoAuthMode), + }); + + const handleNewConversation = () => { + refreshConversations(); + startNewConversation(); + }; + return (
- +
@@ -31,7 +59,7 @@ export function NavigationLayout({ children }: NavigationLayoutProps) { {/* Search component could go here */}
-
- {children} -
+
{children}
); -} \ No newline at end of file +} diff --git a/frontend/components/navigation.tsx b/frontend/components/navigation.tsx index b651ef6a..339b7d22 100644 --- a/frontend/components/navigation.tsx +++ b/frontend/components/navigation.tsx @@ -1,24 +1,35 @@ "use client"; -import { useChat } from "@/contexts/chat-context"; -import { cn } from "@/lib/utils"; import { + EllipsisVertical, FileText, Library, MessageSquare, + MoreHorizontal, Plus, Settings2, + Trash2, } from "lucide-react"; import Link from "next/link"; import { usePathname } from "next/navigation"; -import { useCallback, useEffect, useRef, useState } from "react"; - -import { EndpointType } from "@/contexts/chat-context"; -import { useLoadingStore } from "@/stores/loadingStore"; -import { KnowledgeFilterList } from "./knowledge-filter-list"; +import { useEffect, useRef, useState } from "react"; +import { toast } from "sonner"; +import { useDeleteSessionMutation } from "@/app/api/queries/useDeleteSessionMutation"; +import { + DropdownMenu, + DropdownMenuContent, + DropdownMenuItem, + DropdownMenuTrigger, +} from "@/components/ui/dropdown-menu"; +import { type EndpointType, useChat } from "@/contexts/chat-context"; import { useKnowledgeFilter } from "@/contexts/knowledge-filter-context"; +import { cn } from "@/lib/utils"; +import { useLoadingStore } from "@/stores/loadingStore"; +import { DeleteSessionModal } from "./delete-session-modal"; +import { KnowledgeFilterList } from "./knowledge-filter-list"; -interface RawConversation { +// Re-export the types for backward compatibility +export interface RawConversation { response_id: string; title: string; endpoint: string; @@ -35,7 +46,7 @@ interface RawConversation { [key: string]: unknown; } -interface ChatConversation { +export interface ChatConversation { response_id: string; title: string; endpoint: EndpointType; @@ -52,11 +63,20 @@ interface ChatConversation { [key: string]: unknown; } -export function Navigation() { +interface NavigationProps { + 
conversations?: ChatConversation[]; + isConversationsLoading?: boolean; + onNewConversation?: () => void; +} + +export function Navigation({ + conversations = [], + isConversationsLoading = false, + onNewConversation, +}: NavigationProps = {}) { const pathname = usePathname(); const { endpoint, - refreshTrigger, loadConversation, currentConversationId, setCurrentConversationId, @@ -70,18 +90,64 @@ export function Navigation() { const { loading } = useLoadingStore(); - const [conversations, setConversations] = useState([]); - const [loadingConversations, setLoadingConversations] = useState(false); const [loadingNewConversation, setLoadingNewConversation] = useState(false); const [previousConversationCount, setPreviousConversationCount] = useState(0); + const [deleteModalOpen, setDeleteModalOpen] = useState(false); + const [conversationToDelete, setConversationToDelete] = + useState(null); const fileInputRef = useRef(null); const { selectedFilter, setSelectedFilter } = useKnowledgeFilter(); + // Delete session mutation + const deleteSessionMutation = useDeleteSessionMutation({ + onSuccess: () => { + toast.success("Conversation deleted successfully"); + + // If we deleted the current conversation, select another one + if ( + conversationToDelete && + currentConversationId === conversationToDelete.response_id + ) { + // Filter out the deleted conversation and find the next one + const remainingConversations = conversations.filter( + (conv) => conv.response_id !== conversationToDelete.response_id, + ); + + if (remainingConversations.length > 0) { + // Load the first available conversation (most recent) + loadConversation(remainingConversations[0]); + } else { + // No conversations left, start a new one + setCurrentConversationId(null); + if (onNewConversation) { + onNewConversation(); + } else { + refreshConversations(); + startNewConversation(); + } + } + } + + setDeleteModalOpen(false); + setConversationToDelete(null); + }, + onError: (error) => { + 
toast.error(`Failed to delete conversation: ${error.message}`); + }, + }); + const handleNewConversation = () => { setLoadingNewConversation(true); - refreshConversations(); - startNewConversation(); + + // Use the prop callback if provided, otherwise use the context method + if (onNewConversation) { + onNewConversation(); + } else { + refreshConversations(); + startNewConversation(); + } + if (typeof window !== "undefined") { window.dispatchEvent(new CustomEvent("newConversation")); } @@ -98,7 +164,7 @@ export function Navigation() { window.dispatchEvent( new CustomEvent("fileUploadStart", { detail: { filename: file.name }, - }) + }), ); try { @@ -122,7 +188,7 @@ export function Navigation() { filename: file.name, error: "Failed to process document", }, - }) + }), ); // Trigger loading end event @@ -142,7 +208,7 @@ export function Navigation() { window.dispatchEvent( new CustomEvent("fileUploaded", { detail: { file, result }, - }) + }), ); // Trigger loading end event @@ -156,7 +222,7 @@ export function Navigation() { window.dispatchEvent( new CustomEvent("fileUploadError", { detail: { filename: file.name, error: "Failed to process document" }, - }) + }), ); } }; @@ -176,6 +242,41 @@ export function Navigation() { } }; + const handleDeleteConversation = ( + conversation: ChatConversation, + event?: React.MouseEvent, + ) => { + if (event) { + event.preventDefault(); + event.stopPropagation(); + } + setConversationToDelete(conversation); + setDeleteModalOpen(true); + }; + + const handleContextMenuAction = ( + action: string, + conversation: ChatConversation, + ) => { + switch (action) { + case "delete": + handleDeleteConversation(conversation); + break; + // Add more actions here in the future (rename, duplicate, etc.) 
+ default: + break; + } + }; + + const confirmDeleteConversation = () => { + if (conversationToDelete) { + deleteSessionMutation.mutate({ + sessionId: conversationToDelete.response_id, + endpoint: endpoint, + }); + } + }; + const routes = [ { label: "Chat", @@ -200,91 +301,6 @@ export function Navigation() { const isOnChatPage = pathname === "/" || pathname === "/chat"; const isOnKnowledgePage = pathname.startsWith("/knowledge"); - const createDefaultPlaceholder = useCallback(() => { - return { - response_id: "new-conversation-" + Date.now(), - title: "New conversation", - endpoint: endpoint, - messages: [ - { - role: "assistant", - content: "How can I assist?", - timestamp: new Date().toISOString(), - }, - ], - created_at: new Date().toISOString(), - last_activity: new Date().toISOString(), - total_messages: 1, - } as ChatConversation; - }, [endpoint]); - - const fetchConversations = useCallback(async () => { - setLoadingConversations(true); - try { - // Fetch from the selected endpoint only - const apiEndpoint = - endpoint === "chat" ? 
"/api/chat/history" : "/api/langflow/history"; - - const response = await fetch(apiEndpoint); - if (response.ok) { - const history = await response.json(); - const rawConversations = history.conversations || []; - - // Cast conversations to proper type and ensure endpoint is correct - const conversations: ChatConversation[] = rawConversations.map( - (conv: RawConversation) => ({ - ...conv, - endpoint: conv.endpoint as EndpointType, - }) - ); - - // Sort conversations by last activity (most recent first) - conversations.sort((a: ChatConversation, b: ChatConversation) => { - const aTime = new Date( - a.last_activity || a.created_at || 0 - ).getTime(); - const bTime = new Date( - b.last_activity || b.created_at || 0 - ).getTime(); - return bTime - aTime; - }); - - setConversations(conversations); - - // If no conversations exist and no placeholder is shown, create a default placeholder - if (conversations.length === 0 && !placeholderConversation) { - setPlaceholderConversation(createDefaultPlaceholder()); - } - } else { - setConversations([]); - - // Also create placeholder when request fails and no conversations exist - if (!placeholderConversation) { - setPlaceholderConversation(createDefaultPlaceholder()); - } - } - - // Conversation documents are now managed in chat context - } catch (error) { - console.error(`Failed to fetch ${endpoint} conversations:`, error); - setConversations([]); - } finally { - setLoadingConversations(false); - } - }, [ - endpoint, - placeholderConversation, - setPlaceholderConversation, - createDefaultPlaceholder, - ]); - - // Fetch chat conversations when on chat page, endpoint changes, or refresh is triggered - useEffect(() => { - if (isOnChatPage) { - fetchConversations(); - } - }, [isOnChatPage, endpoint, refreshTrigger, fetchConversations]); - // Clear placeholder when conversation count increases (new conversation was created) useEffect(() => { const currentCount = conversations.length; @@ -326,7 +342,7 @@ export function 
Navigation() { "text-sm group flex p-3 w-full justify-start font-medium cursor-pointer hover:bg-accent hover:text-accent-foreground rounded-lg transition-all", route.active ? "bg-accent text-accent-foreground shadow-sm" - : "text-foreground hover:text-accent-foreground" + : "text-foreground hover:text-accent-foreground", )} >
@@ -335,7 +351,7 @@ export function Navigation() { "h-4 w-4 mr-3 shrink-0", route.active ? "text-accent-foreground" - : "text-muted-foreground group-hover:text-foreground" + : "text-muted-foreground group-hover:text-foreground", )} /> {route.label} @@ -366,6 +382,7 @@ export function Navigation() { Conversations )} {/* Show regular conversations */} @@ -412,9 +430,10 @@ export function Navigation() {
) : ( conversations.map((conversation) => ( -
-
- {conversation.title} -
-
- {conversation.total_messages} messages -
- {conversation.last_activity && ( -
- {new Date( - conversation.last_activity - ).toLocaleDateString()} +
+
+
+ {conversation.title} +
- )} -
+ + + + + e.stopPropagation()} + > + { + e.stopPropagation(); + handleContextMenuAction( + "delete", + conversation, + ); + }} + className="cursor-pointer text-destructive focus:text-destructive" + > + + Delete conversation + + + +
+ )) )} @@ -456,6 +507,7 @@ export function Navigation() { Conversation knowledge
)} + + {/* Delete Session Modal */} + { + setDeleteModalOpen(false); + setConversationToDelete(null); + }} + onConfirm={confirmDeleteConversation} + sessionTitle={conversationToDelete?.title || ""} + isDeleting={deleteSessionMutation.isPending} + /> ); } diff --git a/frontend/components/ui/button.tsx b/frontend/components/ui/button.tsx index b9a83922..aff33335 100644 --- a/frontend/components/ui/button.tsx +++ b/frontend/components/ui/button.tsx @@ -14,8 +14,7 @@ const buttonVariants = cva( "border border-input hover:bg-muted hover:text-accent-foreground disabled:bg-muted disabled:!border-none", primary: "border bg-background text-secondary-foreground hover:bg-muted hover:shadow-sm", - warning: - "bg-warning-foreground text-warning-text hover:bg-warning-foreground/90 hover:shadow-sm", + warning: "bg-warning text-secondary hover:bg-warning/90", secondary: "border border-muted bg-muted text-secondary-foreground hover:bg-secondary-foreground/5", ghost: @@ -39,14 +38,14 @@ const buttonVariants = cva( variant: "default", size: "default", }, - }, + } ); function toTitleCase(text: string) { return text ?.split(" ") ?.map( - (word) => word?.charAt(0)?.toUpperCase() + word?.slice(1)?.toLowerCase(), + (word) => word?.charAt(0)?.toUpperCase() + word?.slice(1)?.toLowerCase() ) ?.join(" "); } @@ -72,7 +71,7 @@ const Button = React.forwardRef( ignoreTitleCase = false, ...props }, - ref, + ref ) => { const Comp = asChild ? 
Slot : "button"; let newChildren = children; @@ -101,7 +100,7 @@ const Button = React.forwardRef( )} ); - }, + } ); Button.displayName = "Button"; diff --git a/frontend/components/ui/dot-pattern.tsx b/frontend/components/ui/dot-pattern.tsx new file mode 100644 index 00000000..aa4b2028 --- /dev/null +++ b/frontend/components/ui/dot-pattern.tsx @@ -0,0 +1,158 @@ +"use client"; + +import { motion } from "motion/react"; +import type React from "react"; +import { useEffect, useId, useRef, useState } from "react"; +import { cn } from "@/lib/utils"; + +/** + * DotPattern Component Props + * + * @param {number} [width=16] - The horizontal spacing between dots + * @param {number} [height=16] - The vertical spacing between dots + * @param {number} [x=0] - The x-offset of the entire pattern + * @param {number} [y=0] - The y-offset of the entire pattern + * @param {number} [cx=1] - The x-offset of individual dots + * @param {number} [cy=1] - The y-offset of individual dots + * @param {number} [cr=1] - The radius of each dot + * @param {string} [className] - Additional CSS classes to apply to the SVG container + * @param {boolean} [glow=false] - Whether dots should have a glowing animation effect + */ +interface DotPatternProps extends React.SVGProps { + width?: number; + height?: number; + x?: number; + y?: number; + cx?: number; + cy?: number; + cr?: number; + className?: string; + glow?: boolean; + [key: string]: unknown; +} + +/** + * DotPattern Component + * + * A React component that creates an animated or static dot pattern background using SVG. + * The pattern automatically adjusts to fill its container and can optionally display glowing dots. + * + * @component + * + * @see DotPatternProps for the props interface. 
+ * + * @example + * // Basic usage + * + * + * // With glowing effect and custom spacing + * + * + * @notes + * - The component is client-side only ("use client") + * - Automatically responds to container size changes + * - When glow is enabled, dots will animate with random delays and durations + * - Uses Motion for animations + * - Dots color can be controlled via the text color utility classes + */ + +export function DotPattern({ + width = 16, + height = 16, + x = 0, + y = 0, + cx = 1, + cy = 1, + cr = 1, + className, + glow = false, + ...props +}: DotPatternProps) { + const id = useId(); + const containerRef = useRef(null); + const [dimensions, setDimensions] = useState({ width: 0, height: 0 }); + + useEffect(() => { + const updateDimensions = () => { + if (containerRef.current) { + const { width, height } = containerRef.current.getBoundingClientRect(); + setDimensions({ width, height }); + } + }; + + updateDimensions(); + window.addEventListener("resize", updateDimensions); + return () => window.removeEventListener("resize", updateDimensions); + }, []); + + const dots = Array.from( + { + length: + Math.ceil(dimensions.width / width) * + Math.ceil(dimensions.height / height), + }, + (_, i) => { + const col = i % Math.ceil(dimensions.width / width); + const row = Math.floor(i / Math.ceil(dimensions.width / width)); + return { + x: col * width + cx, + y: row * height + cy, + delay: Math.random() * 5, + duration: Math.random() * 3 + 2, + }; + }, + ); + + return ( + + ); +} diff --git a/frontend/components/ui/input.tsx b/frontend/components/ui/input.tsx index 5ba0eba0..04599fd0 100644 --- a/frontend/components/ui/input.tsx +++ b/frontend/components/ui/input.tsx @@ -1,3 +1,4 @@ +import { Eye, EyeOff } from "lucide-react"; import * as React from "react"; import { cn } from "@/lib/utils"; @@ -12,6 +13,11 @@ const Input = React.forwardRef( const [hasValue, setHasValue] = React.useState( Boolean(props.value || props.defaultValue), ); + const [showPassword, 
setShowPassword] = React.useState(false); + + const handleTogglePassword = () => { + setShowPassword(!showPassword); + }; const handleChange = (e: React.ChangeEvent) => { setHasValue(e.target.value.length > 0); @@ -23,8 +29,8 @@ const Input = React.forwardRef( return ( ); - } + }, ); Input.displayName = "Input"; diff --git a/frontend/components/ui/radio-group.tsx b/frontend/components/ui/radio-group.tsx index 0968c2a8..de9da9af 100644 --- a/frontend/components/ui/radio-group.tsx +++ b/frontend/components/ui/radio-group.tsx @@ -1,10 +1,10 @@ -"use client" +"use client"; -import * as React from "react" -import * as RadioGroupPrimitive from "@radix-ui/react-radio-group" -import { Circle } from "lucide-react" +import * as React from "react"; +import * as RadioGroupPrimitive from "@radix-ui/react-radio-group"; +import { Circle } from "lucide-react"; -import { cn } from "@/lib/utils" +import { cn } from "@/lib/utils"; const RadioGroup = React.forwardRef< React.ElementRef, @@ -16,9 +16,9 @@ const RadioGroup = React.forwardRef< {...props} ref={ref} /> - ) -}) -RadioGroup.displayName = RadioGroupPrimitive.Root.displayName + ); +}); +RadioGroup.displayName = RadioGroupPrimitive.Root.displayName; const RadioGroupItem = React.forwardRef< React.ElementRef, @@ -28,7 +28,7 @@ const RadioGroupItem = React.forwardRef< - ) -}) -RadioGroupItem.displayName = RadioGroupPrimitive.Item.displayName + ); +}); +RadioGroupItem.displayName = RadioGroupPrimitive.Item.displayName; -export { RadioGroup, RadioGroupItem } \ No newline at end of file +export { RadioGroup, RadioGroupItem }; diff --git a/frontend/components/ui/select.tsx b/frontend/components/ui/select.tsx index b8e19381..66665060 100644 --- a/frontend/components/ui/select.tsx +++ b/frontend/components/ui/select.tsx @@ -1,42 +1,48 @@ -"use client" +"use client"; -import * as React from "react" -import * as SelectPrimitive from "@radix-ui/react-select" -import { Check, ChevronDown, ChevronUp, Lock } from "lucide-react" +import * as 
React from "react"; +import * as SelectPrimitive from "@radix-ui/react-select"; +import { + Check, + ChevronDown, + ChevronsUpDown, + ChevronUp, + LockIcon, +} from "lucide-react"; -import { cn } from "@/lib/utils" +import { cn } from "@/lib/utils"; -const Select = SelectPrimitive.Root +const Select = SelectPrimitive.Root; -const SelectGroup = SelectPrimitive.Group +const SelectGroup = SelectPrimitive.Group; -const SelectValue = SelectPrimitive.Value +const SelectValue = SelectPrimitive.Value; const SelectTrigger = React.forwardRef< React.ElementRef, React.ComponentPropsWithoutRef ->(({ className, children, disabled, ...props }, ref) => ( - span]:line-clamp-1", - disabled && "bg-muted", - className - )} - disabled={disabled} - {...props} - > - {children} - - {disabled ? ( - - ) : ( - +>(({ className, children, ...props }, ref) => { + return ( + span]:line-clamp-1", + className )} - - -)) -SelectTrigger.displayName = SelectPrimitive.Trigger.displayName + {...props} + > + {children} + + {props.disabled ? 
( + + ) : ( + + )} + + + ); +}); +SelectTrigger.displayName = SelectPrimitive.Trigger.displayName; const SelectScrollUpButton = React.forwardRef< React.ElementRef, @@ -52,8 +58,8 @@ const SelectScrollUpButton = React.forwardRef< > -)) -SelectScrollUpButton.displayName = SelectPrimitive.ScrollUpButton.displayName +)); +SelectScrollUpButton.displayName = SelectPrimitive.ScrollUpButton.displayName; const SelectScrollDownButton = React.forwardRef< React.ElementRef, @@ -69,9 +75,9 @@ const SelectScrollDownButton = React.forwardRef< > -)) +)); SelectScrollDownButton.displayName = - SelectPrimitive.ScrollDownButton.displayName + SelectPrimitive.ScrollDownButton.displayName; const SelectContent = React.forwardRef< React.ElementRef, @@ -102,8 +108,8 @@ const SelectContent = React.forwardRef< -)) -SelectContent.displayName = SelectPrimitive.Content.displayName +)); +SelectContent.displayName = SelectPrimitive.Content.displayName; const SelectLabel = React.forwardRef< React.ElementRef, @@ -114,8 +120,8 @@ const SelectLabel = React.forwardRef< className={cn("py-1.5 pl-8 pr-2 text-sm font-semibold", className)} {...props} /> -)) -SelectLabel.displayName = SelectPrimitive.Label.displayName +)); +SelectLabel.displayName = SelectPrimitive.Label.displayName; const SelectItem = React.forwardRef< React.ElementRef, @@ -137,8 +143,8 @@ const SelectItem = React.forwardRef< {children} -)) -SelectItem.displayName = SelectPrimitive.Item.displayName +)); +SelectItem.displayName = SelectPrimitive.Item.displayName; const SelectSeparator = React.forwardRef< React.ElementRef, @@ -149,8 +155,8 @@ const SelectSeparator = React.forwardRef< className={cn("-mx-1 my-1 h-px bg-muted", className)} {...props} /> -)) -SelectSeparator.displayName = SelectPrimitive.Separator.displayName +)); +SelectSeparator.displayName = SelectPrimitive.Separator.displayName; export { Select, @@ -163,4 +169,4 @@ export { SelectSeparator, SelectScrollUpButton, SelectScrollDownButton, -} \ No newline at end of file +}; diff 
--git a/frontend/public/images/background.png b/frontend/public/images/background.png
deleted file mode 100644
index 66d44a8a..00000000
Binary files a/frontend/public/images/background.png and /dev/null differ
diff --git a/frontend/src/app/api/queries/useDeleteSessionMutation.ts b/frontend/src/app/api/queries/useDeleteSessionMutation.ts
new file mode 100644
index 00000000..996e8a44
--- /dev/null
+++ b/frontend/src/app/api/queries/useDeleteSessionMutation.ts
@@ -0,0 +1,57 @@
+import {
+  type MutationOptions,
+  useMutation,
+  useQueryClient,
+} from "@tanstack/react-query";
+import type { EndpointType } from "@/contexts/chat-context";
+
+interface DeleteSessionParams {
+  sessionId: string;
+  endpoint: EndpointType;
+}
+
+interface DeleteSessionResponse {
+  success: boolean;
+  message: string;
+}
+
+export const useDeleteSessionMutation = (
+  options?: Omit<
+    MutationOptions<DeleteSessionResponse, Error, DeleteSessionParams>,
+    "mutationFn"
+  >,
+) => {
+  const queryClient = useQueryClient();
+
+  return useMutation({
+    mutationFn: async ({ sessionId }: DeleteSessionParams) => {
+      const response = await fetch(`/api/sessions/${sessionId}`, {
+        method: "DELETE",
+      });
+
+      if (!response.ok) {
+        const errorData = await response.json().catch(() => ({}));
+        throw new Error(
+          errorData.error || `Failed to delete session: ${response.status}`,
+        );
+      }
+
+      return response.json();
+    },
+    onSettled: (_data, _error, variables) => {
+      // Invalidate conversations query to refresh the list
+      // Use a slight delay to ensure the success callback completes first
+      setTimeout(() => {
+        queryClient.invalidateQueries({
+          queryKey: ["conversations", variables.endpoint],
+        });
+
+        // Also invalidate any specific conversation queries
+        queryClient.invalidateQueries({
+          queryKey: ["conversations"],
+        });
+      }, 0);
+    },
+    ...options,
+  });
+};
diff --git a/frontend/src/app/api/queries/useGetConversationsQuery.ts b/frontend/src/app/api/queries/useGetConversationsQuery.ts
new file mode 100644
index 00000000..f7e579b3
--- /dev/null
+++
b/frontend/src/app/api/queries/useGetConversationsQuery.ts
@@ -0,0 +1,105 @@
+import {
+  type UseQueryOptions,
+  useQuery,
+  useQueryClient,
+} from "@tanstack/react-query";
+import type { EndpointType } from "@/contexts/chat-context";
+
+export interface RawConversation {
+  response_id: string;
+  title: string;
+  endpoint: string;
+  messages: Array<{
+    role: string;
+    content: string;
+    timestamp?: string;
+    response_id?: string;
+  }>;
+  created_at?: string;
+  last_activity?: string;
+  previous_response_id?: string;
+  total_messages: number;
+  [key: string]: unknown;
+}
+
+export interface ChatConversation {
+  response_id: string;
+  title: string;
+  endpoint: EndpointType;
+  messages: Array<{
+    role: string;
+    content: string;
+    timestamp?: string;
+    response_id?: string;
+  }>;
+  created_at?: string;
+  last_activity?: string;
+  previous_response_id?: string;
+  total_messages: number;
+  [key: string]: unknown;
+}
+
+export interface ConversationHistoryResponse {
+  conversations: RawConversation[];
+  [key: string]: unknown;
+}
+
+export const useGetConversationsQuery = (
+  endpoint: EndpointType,
+  refreshTrigger?: number,
+  options?: Omit<UseQueryOptions<ChatConversation[]>, "queryKey" | "queryFn">,
+) => {
+  const queryClient = useQueryClient();
+
+  async function getConversations(): Promise<ChatConversation[]> {
+    try {
+      // Fetch from the selected endpoint only
+      const apiEndpoint =
+        endpoint === "chat" ? "/api/chat/history" : "/api/langflow/history";
+
+      const response = await fetch(apiEndpoint);
+
+      if (!response.ok) {
+        console.error(`Failed to fetch conversations: ${response.status}`);
+        return [];
+      }
+
+      const history: ConversationHistoryResponse = await response.json();
+      const rawConversations = history.conversations || [];
+
+      // Cast conversations to proper type and ensure endpoint is correct
+      const conversations: ChatConversation[] = rawConversations.map(
+        (conv: RawConversation) => ({
+          ...conv,
+          endpoint: conv.endpoint as EndpointType,
+        }),
+      );
+
+      // Sort conversations by last activity (most recent first)
+      conversations.sort((a: ChatConversation, b: ChatConversation) => {
+        const aTime = new Date(a.last_activity || a.created_at || 0).getTime();
+        const bTime = new Date(b.last_activity || b.created_at || 0).getTime();
+        return bTime - aTime;
+      });
+
+      return conversations;
+    } catch (error) {
+      console.error(`Failed to fetch ${endpoint} conversations:`, error);
+      return [];
+    }
+  }
+
+  const queryResult = useQuery(
+    {
+      queryKey: ["conversations", endpoint, refreshTrigger],
+      placeholderData: (prev) => prev,
+      queryFn: getConversations,
+      staleTime: 0, // Always consider data stale to ensure fresh data on trigger changes
+      gcTime: 5 * 60 * 1000, // Keep in cache for 5 minutes
+      ...options,
+    },
+    queryClient,
+  );

+  return queryResult;
+};
diff --git a/frontend/src/app/api/queries/useGetModelsQuery.ts b/frontend/src/app/api/queries/useGetModelsQuery.ts
index 4ce55bd3..3a5eb77e 100644
--- a/frontend/src/app/api/queries/useGetModelsQuery.ts
+++ b/frontend/src/app/api/queries/useGetModelsQuery.ts
@@ -90,7 +90,6 @@ export const useGetOllamaModelsQuery = (
     queryKey: ["models", "ollama", params],
     queryFn: getOllamaModels,
     retry: 2,
-    enabled: !!params?.endpoint, // Only run if endpoint is provided
     staleTime: 0, // Always fetch fresh data
     gcTime: 0, // Don't cache results
     ...options,
diff --git a/frontend/src/app/login/page.tsx
b/frontend/src/app/login/page.tsx
index c2347f1b..1639a4be 100644
--- a/frontend/src/app/login/page.tsx
+++ b/frontend/src/app/login/page.tsx
@@ -6,7 +6,9 @@ import { Suspense, useEffect } from "react";
 import GoogleLogo from "@/components/logo/google-logo";
 import Logo from "@/components/logo/logo";
 import { Button } from "@/components/ui/button";
+import { DotPattern } from "@/components/ui/dot-pattern";
 import { useAuth } from "@/contexts/auth-context";
+import { cn } from "@/lib/utils";
 import { useGetSettingsQuery } from "../api/queries/useGetSettingsQuery";
[hunks at @@ -53,15 +55,19 @@ and @@ -72,7 +78,7 @@ in LoginPageContent lost to extraction (JSX stripped): they wrap the login card in a DotPattern background with cn()-composed classes; the "Welcome to OpenRAG" heading, "Continue with Google" button, "Systems Operational" indicator, and "Privacy Policy" link are retained]
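The last-activity ordering inside `useGetConversationsQuery` earlier in this diff can be exercised as a pure comparator. A minimal sketch, assuming the same optional string-timestamp fields the hook reads:

```typescript
type ConversationTimes = { last_activity?: string; created_at?: string };

// Same ordering rule as the hook: prefer last_activity, fall back to
// created_at, then to the epoch, and sort most-recent first.
function byLastActivityDesc(a: ConversationTimes, b: ConversationTimes): number {
  const aTime = new Date(a.last_activity || a.created_at || 0).getTime();
  const bTime = new Date(b.last_activity || b.created_at || 0).getTime();
  return bTime - aTime;
}

const sorted: ConversationTimes[] = [
  { last_activity: "2024-01-01T00:00:00Z" },
  { created_at: "2024-06-01T00:00:00Z" },
  {},
].sort(byLastActivityDesc);
// The June conversation sorts first; the undated one sorts last.
```

Because missing timestamps collapse to the epoch, undated conversations always sink to the bottom rather than interleaving unpredictably.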
diff --git a/frontend/src/app/onboarding/components/advanced.tsx b/frontend/src/app/onboarding/components/advanced.tsx
index bb0089d5..20764aed 100644
--- a/frontend/src/app/onboarding/components/advanced.tsx
+++ b/frontend/src/app/onboarding/components/advanced.tsx
[hunks at @@ -47,8 +47,7 @@, @@ -63,8 +62,7 @@, and @@ -79,7 +77,7 @@ lost to extraction (JSX stripped): they adjust the props passed to the selectors rendered inside the {hasEmbeddingModels && ...} and {hasLanguageModels && ...} blocks and the trailing {(hasLanguageModels || hasEmbeddingModels) && ...} element]
diff --git a/frontend/src/app/onboarding/components/ibm-onboarding.tsx b/frontend/src/app/onboarding/components/ibm-onboarding.tsx
index 550f9d6b..63e3fe6a 100644
--- a/frontend/src/app/onboarding/components/ibm-onboarding.tsx
+++ b/frontend/src/app/onboarding/components/ibm-onboarding.tsx
@@ -1,5 +1,6 @@
 import { useState } from "react";
 import { LabelInput } from "@/components/label-input";
+import { LabelWrapper } from "@/components/label-wrapper";
 import IBMLogo from "@/components/logo/ibm-logo";
 import { useDebouncedValue } from "@/lib/debounce";
 import type { OnboardingVariables } from "../../api/mutations/useOnboardingMutation";
@@ -7,6 +8,7 @@ import { useGetIBMModelsQuery } from "../../api/queries/useGetModelsQuery";
 import { useModelSelection } from "../hooks/useModelSelection";
 import { useUpdateSettings } from "../hooks/useUpdateSettings";
 import { AdvancedOnboarding } from "./advanced";
+import { ModelSelector } from "./model-selector";

 export function IBMOnboarding({
   setSettings,
@@ -17,10 +19,42 @@ export function IBMOnboarding({
   sampleDataset: boolean;
   setSampleDataset: (dataset: boolean) => void;
 }) {
-  const [endpoint, setEndpoint] = useState("");
+  const [endpoint, setEndpoint] = useState("https://us-south.ml.cloud.ibm.com");
   const [apiKey, setApiKey] = useState("");
   const [projectId, setProjectId] = useState("");
+  const options = [
+    {
+      value: "https://us-south.ml.cloud.ibm.com",
+      label: "https://us-south.ml.cloud.ibm.com",
+      default: true,
+    },
+    {
+      value: "https://eu-de.ml.cloud.ibm.com",
+      label: "https://eu-de.ml.cloud.ibm.com",
+      default: false,
+    },
+    {
+      value: "https://eu-gb.ml.cloud.ibm.com",
+      label: "https://eu-gb.ml.cloud.ibm.com",
+      default: false,
+    },
+    {
+      value: "https://au-syd.ml.cloud.ibm.com",
+      label: "https://au-syd.ml.cloud.ibm.com",
+      default: false,
+    },
+    {
+      value: "https://jp-tok.ml.cloud.ibm.com",
+      label: "https://jp-tok.ml.cloud.ibm.com",
+      default: false,
+    },
+    {
+      value: "https://ca-tor.ml.cloud.ibm.com",
+      label: "https://ca-tor.ml.cloud.ibm.com",
+      default: false,
+    },
+  ];
   const debouncedEndpoint = useDebouncedValue(endpoint, 500);
   const debouncedApiKey = useDebouncedValue(apiKey, 500);
   const debouncedProjectId = useDebouncedValue(projectId, 500);
@@ -68,19 +102,26 @@
   return (
     <>
[hunk body lost to extraction (JSX stripped): the plain endpoint LabelInput is replaced by a LabelWrapper containing a ModelSelector over the regional endpoint options above; the error copy changes from "Invalid configuration or connection failed" to "Connection failed. Check your configuration."; the "Configuration is valid" banner, shown when the query returned language or embedding models, is removed]
}
diff --git a/frontend/src/app/onboarding/components/model-selector.tsx b/frontend/src/app/onboarding/components/model-selector.tsx
index 7a74bed2..dfed52ee 100644
--- a/frontend/src/app/onboarding/components/model-selector.tsx
+++ b/frontend/src/app/onboarding/components/model-selector.tsx
@@ -21,6 +21,9 @@ export function ModelSelector({
   value,
   onValueChange,
   icon,
+  placeholder = "Select model...",
+  searchPlaceholder = "Search model...",
+  noOptionsPlaceholder = "No models available",
 }: {
   options: {
     value: string;
@@ -29,6 +32,9 @@
   }[];
   value: string;
   icon?: React.ReactNode;
+  placeholder?: string;
+  searchPlaceholder?: string;
+  noOptionsPlaceholder?: string;
   onValueChange: (value: string) => void;
 }) {
   const [open, setOpen] = useState(false);
@@ -50,7 +56,7 @@
         >
           {value ? (
[span lost to extraction (JSX stripped): the icon wrapper is now rendered only when icon is set, i.e. {icon && ...{icon}...}, next to the selected option's label and default badge]
           ) : options.length === 0 ? (
-            "No models available"
+            noOptionsPlaceholder
           ) : (
-            "Select model..."
+            placeholder
           )}
@@ -60,18 +66,18 @@
[search input and empty state stripped: the search input placeholder becomes {searchPlaceholder} and the empty state changes from "No model found." to {noOptionsPlaceholder}]
           {options.map((option) => (
[remainder of model-selector.tsx and the header of the ollama-onboarding.tsx diff lost to extraction; the surviving ollama-onboarding.tsx changes resume below]
diff --git a/frontend/src/app/onboarding/components/ollama-onboarding.tsx b/frontend/src/app/onboarding/components/ollama-onboarding.tsx
[index line and import hunk lost to extraction]
 export function OllamaOnboarding({
   setSampleDataset: (dataset: boolean) => void;
 }) {
-  const [endpoint, setEndpoint] = useState("");
+  const [endpoint, setEndpoint] = useState("http://localhost:11434");
+  const [showConnecting, setShowConnecting] = useState(false);
   const debouncedEndpoint = useDebouncedValue(endpoint, 500);

   // Fetch models from API when endpoint is provided (debounced)
@@ -41,6 +42,25 @@ export function OllamaOnboarding({
     embeddingModels,
   } = useModelSelection(modelsData);

+  // Handle delayed display of connecting state
+  useEffect(() => {
+    let timeoutId: NodeJS.Timeout;
+
+    if (debouncedEndpoint && isLoadingModels) {
+      timeoutId = setTimeout(() => {
+        setShowConnecting(true);
+      }, 500);
+    } else {
+      setShowConnecting(false);
+    }
+
+    return () => {
+      if (timeoutId) {
+        clearTimeout(timeoutId);
+      }
+    };
+  }, [debouncedEndpoint, isLoadingModels]);
+
   const handleSampleDatasetChange = (dataset: boolean) => {
     setSampleDataset(dataset);
   };
@@ -57,74 +77,75 @@
   );

   // Check validation state based on models query
-  const isConnecting = debouncedEndpoint && isLoadingModels;
   const hasConnectionError = debouncedEndpoint && modelsError;
   const hasNoModels =
     modelsData &&
     !modelsData.language_models?.length &&
     !modelsData.embedding_models?.length;
-  const isValidConnection =
-    modelsData &&
-    (modelsData.language_models?.length > 0 ||
-      modelsData.embedding_models?.length > 0);

   return (
     <>
[hunk body lost to extraction (JSX stripped): the status banners below the endpoint input change as follows: {isConnecting && ...} becomes {showConnecting && ...} around "Connecting to Ollama server..."; the error copy changes from "Can't reach Ollama at {debouncedEndpoint}. Update the endpoint or start the server." to "Can't reach Ollama at {debouncedEndpoint}. Update the base URL or start the server."; "No models found. Please install some models on your Ollama server." becomes "No models found. Install embedding and agent models on your Ollama server."; the {isValidConnection && ...} "Connected successfully" banner is removed]
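The banner flags this Ollama hunk computes reduce to one pure decision. A sketch under the assumption that the models query exposes the same endpoint, loading, error, and data values as the component (the real component additionally delays "connecting" by 500 ms before showing it):

```typescript
type OllamaModels = {
  language_models?: string[];
  embedding_models?: string[];
};

// Pure version of the hasConnectionError / hasNoModels / showConnecting
// flags: one precedence-ordered status instead of four booleans.
function ollamaStatus(
  endpoint: string,
  isLoading: boolean,
  error: unknown,
  data: OllamaModels | undefined,
): "idle" | "connecting" | "error" | "no-models" | "ok" {
  if (!endpoint) return "idle";
  if (isLoading) return "connecting";
  if (error) return "error";
  if (!data) return "idle";
  const hasModels =
    (data.language_models?.length ?? 0) > 0 ||
    (data.embedding_models?.length ?? 0) > 0;
  return hasModels ? "ok" : "no-models";
}
```

Ordering matters: loading must win over stale error/data from a previous endpoint, which is why the checks cascade rather than being independent booleans.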
 }
+  noOptionsPlaceholder={
+    isLoadingModels
+      ? "Loading models..."
+      : "No embedding models detected. Install an embedding model to continue."
+  }
   value={embeddingModel}
   onValueChange={setEmbeddingModel}
 />
 }
+  noOptionsPlaceholder={
+    isLoadingModels
+      ? "Loading models..."
+      : "No language models detected. Install a language model to continue."
+  }
   value={languageModel}
   onValueChange={setLanguageModel}
 />
diff --git a/frontend/src/app/onboarding/components/openai-onboarding.tsx b/frontend/src/app/onboarding/components/openai-onboarding.tsx
index cf18fb53..236097a4 100644
--- a/frontend/src/app/onboarding/components/openai-onboarding.tsx
+++ b/frontend/src/app/onboarding/components/openai-onboarding.tsx
@@ -1,6 +1,8 @@
 import { useState } from "react";
 import { LabelInput } from "@/components/label-input";
+import { LabelWrapper } from "@/components/label-wrapper";
 import OpenAILogo from "@/components/logo/openai-logo";
+import { Switch } from "@/components/ui/switch";
 import { useDebouncedValue } from "@/lib/debounce";
 import type { OnboardingVariables } from "../../api/mutations/useOnboardingMutation";
 import { useGetOpenAIModelsQuery } from "../../api/queries/useGetModelsQuery";
@@ -18,6 +20,7 @@ export function OpenAIOnboarding({
   setSampleDataset: (dataset: boolean) => void;
 }) {
   const [apiKey, setApiKey] = useState("");
+  const [getFromEnv, setGetFromEnv] = useState(true);
   const debouncedApiKey = useDebouncedValue(apiKey, 500);

   // Fetch models from API when API key is provided
@@ -26,7 +29,12 @@ export function OpenAIOnboarding({
     isLoading: isLoadingModels,
     error: modelsError,
   } = useGetOpenAIModelsQuery(
-    debouncedApiKey ? { apiKey: debouncedApiKey } : undefined,
+    getFromEnv
+      ? { apiKey: "" }
+      : debouncedApiKey
+        ? { apiKey: debouncedApiKey }
+        : undefined,
+    { enabled: debouncedApiKey !== "" || getFromEnv },
   );
   // Use custom hook for model selection logic
   const {
@@ -41,6 +49,15 @@
     setSampleDataset(dataset);
   };

+  const handleGetFromEnvChange = (fromEnv: boolean) => {
+    setGetFromEnv(fromEnv);
+    if (fromEnv) {
+      setApiKey("");
+    }
+    setLanguageModel("");
+    setEmbeddingModel("");
+  };
+
   // Update settings when values change
   useUpdateSettings(
     "openai",
   );

   return (
     <>
[hunk body lost to extraction (JSX stripped): the API-key LabelInput is wrapped in a LabelWrapper with a Switch bound to getFromEnv; the key input, its "Validating API key..." loading state, and its error banner are now rendered only inside {!getFromEnv && (...)}; the error copy changes from "Invalid API key" to "Invalid OpenAI API key. Verify or replace the key."; the "API Key is valid" success banner, shown when the query returned language or embedding models, is removed]
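The query-parameter ternary this OpenAI hunk adds can be factored as a pure function. A sketch; the assumption that an empty `apiKey` makes the server fall back to its own configured key is inferred from the diff, not stated in it:

```typescript
interface OpenAIModelParams {
  apiKey: string;
}

// Mirrors the ternary added to OpenAIOnboarding: with "get key from
// environment" on, query immediately with an empty key; otherwise wait
// until the user has typed (and debouncing has settled) a key.
function openAIQueryParams(
  getFromEnv: boolean,
  debouncedApiKey: string,
): OpenAIModelParams | undefined {
  if (getFromEnv) return { apiKey: "" };
  return debouncedApiKey ? { apiKey: debouncedApiKey } : undefined;
}

// The query is gated the same way the diff gates `enabled`:
const queryEnabled = (debouncedApiKey: string, getFromEnv: boolean) =>
  debouncedApiKey !== "" || getFromEnv;
```

Keeping the params and the `enabled` gate consistent avoids a query that runs with `undefined` params or stays disabled while params exist.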
}
diff --git a/frontend/src/app/onboarding/page.tsx b/frontend/src/app/onboarding/page.tsx
index c58abfea..a82e5fab 100644
--- a/frontend/src/app/onboarding/page.tsx
+++ b/frontend/src/app/onboarding/page.tsx
[this diff re-indents the entire file, so every line of OnboardingPage and ProtectedOnboardingPage is removed and re-added, and the JSX markup was stripped in extraction. Substantive changes that survive: new imports (+import { DotPattern } from "@/components/ui/dot-pattern"; +import { cn } from "@/lib/utils";); the card header changes from "Configure your models" with a "[description of task]" placeholder to "Connect a model provider"; the footer tooltip now renders {!isComplete && ("Please fill in all required fields")} instead of a ternary with an empty string. Logic that is only re-indented: the settingsDb.edited redirect effect, modelProvider/sampleDataset state, handleSetModelProvider, the onboarding mutation with its "Onboarding completed successfully!" and "Failed to complete onboarding" toasts, handleComplete validation (toast "Please complete all required fields" when model_provider, llm_model, or embedding_model is missing) with optional api_key/endpoint/project_id fields, isComplete = !!settings.llm_model && !!settings.embedding_model, the OpenAI/IBM/Ollama provider tabs, and the Suspense fallback "Loading onboarding..."]
diff --git a/frontend/src/app/settings/page.tsx b/frontend/src/app/settings/page.tsx
index 7f1ca858..a63d91d3 100644
--- a/frontend/src/app/settings/page.tsx
+++ b/frontend/src/app/settings/page.tsx
@@ -1,6 +1,6 @@
 "use client";

-import { Loader2, PlugZap, RefreshCw } from "lucide-react";
+import { ArrowUpRight, Loader2, PlugZap, RefreshCw } from "lucide-react";
 import { useSearchParams } from "next/navigation";
 import { Suspense, useCallback, useEffect, useState } from "react";
 import { useUpdateFlowSettingMutation } from "@/app/api/mutations/useUpdateFlowSettingMutation";
@@ -35,10 +35,17 @@ import { Textarea } from "@/components/ui/textarea";
 import { useAuth } from "@/contexts/auth-context";
 import { useTask } from "@/contexts/task-context";
 import { useDebounce } from "@/lib/debounce";
+import { DEFAULT_AGENT_SETTINGS, DEFAULT_KNOWLEDGE_SETTINGS, UI_CONSTANTS } from "@/lib/constants";
 import { getFallbackModels, type ModelProvider } from "./helpers/model-helpers";
 import { ModelSelectItems } from "./helpers/model-select-item";
+import { LabelWrapper } from "@/components/label-wrapper";
+import {
+  Tooltip,
+  TooltipContent,
+  TooltipTrigger,
+} from "@radix-ui/react-tooltip";

-const MAX_SYSTEM_PROMPT_CHARS = 2000;
+const { MAX_SYSTEM_PROMPT_CHARS } = UI_CONSTANTS;

 interface GoogleDriveFile {
   id: string;
@@ -122,7 +129,7 @@ function KnowledgeSourcesPage() {
     {
       enabled:
         (isAuthenticated || isNoAuthMode) && currentProvider === "openai",
-    },
+    }
   );

   const { data: ollamaModelsData } = useGetOllamaModelsQuery(
     {
       enabled:
         (isAuthenticated || isNoAuthMode) && currentProvider === "ollama",
-    },
+    }
   );

   const { data: ibmModelsData } = useGetIBMModelsQuery(
     undefined, // No params for now, could be extended later
     {
       enabled: (isAuthenticated || isNoAuthMode) && currentProvider === "ibm",
-    },
+    }
   );

   // Select the appropriate models data based on provider
@@ -165,7 +172,7 @@ function
KnowledgeSourcesPage() {
     (variables: Parameters[0]) => {
       updateFlowSettingMutation.mutate(variables);
     },
-    500,
+    500
   );

   // Sync system prompt state with settings data
@@ -293,7 +300,7 @@ function KnowledgeSourcesPage() {
       const data = await response.json();
       const connections = data.connections || [];
       const activeConnection = connections.find(
-        (conn: Connection) => conn.is_active,
+        (conn: Connection) => conn.is_active
       );

       const isConnected = activeConnection !== undefined;
@@ -305,8 +312,8 @@ function KnowledgeSourcesPage() {
               status: isConnected ? "connected" : "not_connected",
               connectionId: activeConnection?.connection_id,
             }
-          : c,
-      ),
+          : c
+      )
     );
   }
 }
@@ -349,7 +356,7 @@ function KnowledgeSourcesPage() {
         `response_type=code&` +
         `scope=${result.oauth_config.scopes.join(" ")}&` +
         `redirect_uri=${encodeURIComponent(
-          result.oauth_config.redirect_uri,
+          result.oauth_config.redirect_uri
         )}&` +
         `access_type=offline&` +
         `prompt=consent&` +
@@ -498,7 +505,7 @@ function KnowledgeSourcesPage() {
   const handleEditInLangflow = (
     flowType: "chat" | "ingest",
-    closeDialog: () => void,
+    closeDialog: () => void
   ) => {
     // Select the appropriate flow ID and edit URL based on flow type
     const targetFlowId =
@@ -529,8 +536,17 @@ function KnowledgeSourcesPage() {
     fetch(`/api/reset-flow/retrieval`, {
       method: "POST",
     })
-      .then((response) => response.json())
+      .then((response) => {
+        if (response.ok) {
+          return response.json();
+        }
+        throw new Error(`HTTP ${response.status}: ${response.statusText}`);
+      })
       .then(() => {
+        // Only reset form values if the API call was successful
+        setSystemPrompt(DEFAULT_AGENT_SETTINGS.system_prompt);
+        // Trigger model update to default model
+        handleModelChange(DEFAULT_AGENT_SETTINGS.llm_model);
         closeDialog(); // Close after successful completion
       })
       .catch((error) => {
@@ -543,8 +559,17 @@ function KnowledgeSourcesPage() {
     fetch(`/api/reset-flow/ingest`, {
       method: "POST",
     })
-      .then((response) => response.json())
+      .then((response) => {
+        if (response.ok) {
+          return response.json();
+        }
+        throw new Error(`HTTP ${response.status}: ${response.statusText}`);
+      })
       .then(() => {
+        // Only reset form values if the API call was successful
+        setChunkSize(DEFAULT_KNOWLEDGE_SETTINGS.chunk_size);
+        setChunkOverlap(DEFAULT_KNOWLEDGE_SETTINGS.chunk_overlap);
+        setProcessingMode(DEFAULT_KNOWLEDGE_SETTINGS.processing_mode);
         closeDialog(); // Close after successful completion
       })
       .catch((error) => {
@@ -555,350 +580,6 @@ function KnowledgeSourcesPage() {
   return (
[removed "Agent Behavior Section" JSX lost to extraction: a Card titled "Agent" ("Quick Agent settings. Edit in Langflow for full control.") with a "Restore flow" confirm dialog ("Restore default Agent flow": "This restores defaults and discards all custom settings and overrides. This can't be undone.", confirm "Restore", destructive variant, onConfirm handleRestoreRetrievalFlow) and an "Edit in Langflow" confirm dialog with a Langflow icon ("Edit Agent flow in Langflow": "You're entering Langflow. You can edit the Agent flow and other underlying flows. Manual changes to components, wiring, or I/O can break this experience.", confirm "Proceed") that calls handleEditInLangflow("chat", closeDialog)]
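Both reset-flow hunks above add the same response guard before parsing JSON. Factored as a standalone helper for clarity (the helper name and the narrowed response type are illustrative, not from the diff):

```typescript
interface ResponseLike {
  ok: boolean;
  status: number;
  statusText: string;
}

// The guard added to both reset-flow handlers: non-2xx responses become
// thrown errors instead of being silently parsed as JSON.
function assertOk<T extends ResponseLike>(response: T): T {
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }
  return response;
}
```

In the handlers this sits in the first `.then`, so the thrown error falls through to the existing `.catch` and the form values are only reset on success.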