## Problem
`useAgentChat` forces React Suspense via `use()` for initial message fetching. This creates two architectural constraints:

- **Fetch location = Suspense boundary location.** You can't fetch messages in a parent component and handle loading/error states in a child. If the component calling `useAgentChat` unmounts (e.g., a panel toggle), it re-suspends on remount — there's no way to keep the fetch alive independently.
- **No exported fetch function.** Modern frameworks prefetch data in route loaders to eliminate waterfalls. The library's message fetch is locked inside `useAgentChat` — there's no standalone function to call from a loader. It's possible to work around this by constructing the messages URL manually and feeding the result via `getInitialMessages: null` + `messages`, but this depends on knowing the internal URL shape and isn't a supported pattern. Exporting something like `defaultGetInitialMessagesFetch` would make this a first-class workflow.

The `getInitialMessages: null` + `messages` option bypasses `use()`, but there's no clean public API for non-suspending consumption.
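To make the workaround concrete, the sketch below shows the kind of URL construction a consumer currently has to reverse-engineer. Both `buildMessagesUrl` and the path layout are invented for illustration (they are not the library's actual internals), which is precisely why an exported fetch function would help.

```typescript
// Hypothetical sketch only: buildMessagesUrl and the path shape below are
// invented to illustrate the workaround. The real internal URL shape is
// undocumented, which is what makes this pattern fragile.
interface MessagesUrlOptions {
  host: string;                    // e.g. "example.workers.dev"
  agent: string;                   // agent binding name
  name: string;                    // agent instance name
  query?: Record<string, string>;  // extra params, like useAgent's query
}

function buildMessagesUrl(opts: MessagesUrlOptions): string {
  // Guessed path layout; a supported export would make this unnecessary.
  const url = new URL(
    `/agents/${opts.agent}/${opts.name}/get-messages`,
    `https://${opts.host}`
  );
  for (const [key, value] of Object.entries(opts.query ?? {})) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}
```

A consumer would fetch this URL in a loader and pass the result via `getInitialMessages: null` + `messages`: workable, but coupled to internals that can change without notice.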
## Proposal

Two additions:

- Export a framework-agnostic fetch function that can be called in any route loader to prefetch messages before the component tree mounts.
- Offer both non-suspending and suspending hooks, following the `useQuery`/`useSuspenseQuery` convention from TanStack Query.
## API sketch

Just a rough sketch to illustrate the shape — not prescriptive.

```ts
// 1. Standalone fetch — use in any framework's loader
import { getAgentMessages } from '@cloudflare/ai-chat'

// TanStack Start / Router
loader: async () => {
  const messages = await getAgentMessages({
    host,
    agent,
    name,
    query: { workflowId }, // same params as useAgent's query
  })
  return { messages }
}
// Next.js, Remix — same function, different loader shape

// 2. Non-suspending hook — consumer handles loading/error
import { useAgentChat } from '@cloudflare/ai-chat/react'

const { messages, isPending, error } = useAgentChat({
  agent,
  initialMessages, // from loader or cache
})

// 3. Suspending hook — current behavior, explicit opt-in
import { useSuspenseAgentChat } from '@cloudflare/ai-chat/react'

const { messages } = useSuspenseAgentChat({ agent })
```

## Benefits
- **Familiar patterns.** Developers already use `useQuery`/`useSuspenseQuery` in TanStack Query and loader-based prefetching in TanStack Start, Next.js, and Remix. Same mental model, no learning curve.
- **Framework-agnostic.** A standalone fetch function works in any framework's loader — no coupling to React Suspense or a specific router.
- **Efficient loading.** Messages prefetch in parallel with other loader data instead of waiting for component mount. No waterfalls.
- **Decoupled UI.** Consumers choose independently where to fetch and where to show loading/error states. Panel toggles, tab switches, and layout changes don't trigger refetches.
## Implementation notes

The building blocks already exist internally:

- `defaultGetInitialMessagesFetch` is the standalone fetch function — it just needs to be exported.
- The `getInitialMessages: null` + `messages` option already bypasses `use()` — the non-suspending hook would wrap this path with `{ isPending, error }` tracking.
- `useSuspenseAgentChat` is the current `useAgentChat` renamed for clarity.
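As a rough illustration of that `{ isPending, error }` tracking, here is a framework-agnostic sketch of the state transitions a non-suspending hook could layer over the existing path. `FetchState` and `trackInitialMessages` are hypothetical names, not the library's API.

```typescript
// Hypothetical sketch: the state tracking a non-suspending hook could wrap
// around the existing getInitialMessages: null + messages path. The names
// FetchState and trackInitialMessages are illustrative, not the library's API.
type FetchState<T> = {
  isPending: boolean;
  error: Error | null;
  data: T | null;
};

// Runs the fetch and reports each transition: pending -> loaded or failed.
// Inside a React hook, onChange would be a setState call.
async function trackInitialMessages<T>(
  fetchMessages: () => Promise<T>,
  onChange: (state: FetchState<T>) => void
): Promise<void> {
  onChange({ isPending: true, error: null, data: null });
  try {
    const data = await fetchMessages();
    onChange({ isPending: false, error: null, data });
  } catch (err) {
    onChange({ isPending: false, error: err as Error, data: null });
  }
}
```

A non-suspending `useAgentChat` would surface the latest of these states to the consumer, while `useSuspenseAgentChat` keeps the current throw-the-promise behavior.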