Hi team,

I have a Next.js 15 app using AI SDK v5 with OpenRouter. Streaming works on the server side (the console logs every `text-delta` part), but the UI displays the complete response all at once instead of streaming word by word.

Server-side evidence (working):

```
🔥 STREAM PART: {"type":"text-delta","id":"gen-xxx","delta":"Hi "}
```

Client-side issue: the UI shows the complete response instantly instead of streaming.

API route:

```ts
const stream = createUIMessageStream({
  // ...
})
// ...
return new Response(
```

Client:

```ts
const { messages, status } = useChat({
  // ...
```

Thanks
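A symptom like this usually comes down to where the stream stops being incremental: the server can emit deltas correctly while something downstream delivers them in one batch. A minimal, self-contained sketch (no AI SDK, hypothetical words) of the behavior the UI depends on: a `ReadableStream` consumed chunk-by-chunk with a reader yields each delta separately, one read per enqueued chunk.

```typescript
// Stand-in for the server stream: three "text-delta" payloads.
// Consuming with a reader observes each delta as it arrives, which is
// what word-by-word rendering needs. If anything in between buffers,
// all the text shows up in a single update instead.
async function consumeIncrementally(): Promise<string[]> {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  const stream = new ReadableStream({
    async start(controller) {
      for (const delta of ["Hi ", "there, ", "world"]) {
        controller.enqueue(encoder.encode(delta));
        await new Promise((r) => setTimeout(r, 5)); // simulate generation delay
      }
      controller.close();
    },
  });

  const chunks: string[] = [];
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(decoder.decode(value)); // one UI update per delta
  }
  return chunks;
}

consumeIncrementally().then((chunks) => console.log(chunks));
// logs three separate chunks: "Hi ", "there, ", "world"
```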
This is most likely a network setup issue. Check out these docs:
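For context, a common network-setup culprit is an intermediary (reverse proxy, compression middleware) buffering the SSE response so all deltas arrive in one flush. A hedged sketch of response headers that are typically set on a streaming `Response` to discourage buffering; whether each one is needed depends on your deployment, and `X-Accel-Buffering` is specific to nginx:

```typescript
// Hypothetical example: headers commonly used for SSE streaming responses.
const streamingHeaders = {
  "Content-Type": "text/event-stream",
  "Cache-Control": "no-cache, no-transform", // no-transform: don't let proxies re-encode
  Connection: "keep-alive",
  "X-Accel-Buffering": "no", // nginx-specific: disable proxy buffering
};

console.log(streamingHeaders["Content-Type"]); // text/event-stream
```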
Thanks for the reply. I found the issue: a `React.memo` comparison function in `components/message.tsx` was blocking the UI updates. It always returned `true`, so React skipped every re-render; changing it to return `false` fixed it.
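For anyone hitting the same thing: `React.memo(Component, arePropsEqual)` takes a comparator that returns `true` to mean "props are equal, skip the re-render" (the opposite polarity of `shouldComponentUpdate`), so a comparator hard-coded to `true` freezes the first render forever. A minimal simulation of that contract, without React itself (names and props are illustrative):

```typescript
// Simulates React.memo's comparator contract: when areEqual(prev, next)
// returns true, the cached output is reused and the render is skipped.
type Props = { text: string };

function memoSimulate(
  render: (p: Props) => string,
  areEqual: (prev: Props, next: Props) => boolean
) {
  let prevProps: Props | null = null;
  let lastOutput = "";
  return (next: Props): string => {
    if (prevProps !== null && areEqual(prevProps, next)) {
      return lastOutput; // render skipped: UI stays stale
    }
    prevProps = next;
    lastOutput = render(next);
    return lastOutput;
  };
}

const render = (p: Props) => `Message: ${p.text}`;

// Buggy comparator: always "equal", so the first render is frozen.
const buggy = memoSimulate(render, () => true);
console.log(buggy({ text: "Hi " }));      // Message: Hi
console.log(buggy({ text: "Hi there" })); // Message: Hi   (stale!)

// Fixed comparator: compare the prop that changes during streaming.
const fixed = memoSimulate(render, (a, b) => a.text === b.text);
console.log(fixed({ text: "Hi " }));      // Message: Hi
console.log(fixed({ text: "Hi there" })); // Message: Hi there
```

Returning `false` unconditionally (as in the fix above) works, but it disables memoization entirely; comparing the streaming prop keeps the memo useful while still letting each delta render.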