
Conversation

@FranciscoMoretti

Moving context to message metadata

Message metadata is a better fit:

  • Each context is related to an assistant response
  • Removing/editing a message from the conversation would not require context cleanup on the chat
  • Loading context from history is free (comes with loading messages)
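
As a minimal sketch of the shape this implies (the ChatMessageMetadata name and its fields are illustrative assumptions, not code from this PR, and assume the AI SDK v5 UIMessage metadata generic and that LanguageModelUsage is exported from 'ai'):

import type { UIMessage, LanguageModelUsage } from 'ai';

// Hypothetical per-message metadata: the context/usage of an assistant
// response travels with the message itself, so deleting or reloading the
// message carries its context along with it.
type ChatMessageMetadata = {
  createdAt?: string;
  usage?: LanguageModelUsage;
};

type ChatMessage = UIMessage<ChatMessageMetadata>;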

NOTE: I haven't been able to verify these changes because I'm getting a few db errors. I think I need to set up this project again.


vercel bot commented Sep 10, 2025

@FranciscoMoretti is attempting to deploy a commit to the Vercel Team on Vercel.

A member of the Team first needs to authorize it.

createdAt: new Date(),
attachments: [],
chatId: id,
lastContext: null,


The lastContext field is hardcoded to null when saving messages, but the client receives usage data through messageMetadata, creating a disconnect between stored and displayed data.

📝 Patch Details
diff --git a/app/(chat)/api/chat/route.ts b/app/(chat)/api/chat/route.ts
index 05df486..fad9af8 100644
--- a/app/(chat)/api/chat/route.ts
+++ b/app/(chat)/api/chat/route.ts
@@ -150,6 +150,9 @@ export async function POST(request: Request) {
     const streamId = generateUUID();
     await createStreamId({ streamId, chatId: id });
 
+    // Store usage data to be captured from streamText
+    let totalUsage: any = null;
+
     const stream = createUIMessageStream({
       execute: ({ writer: dataStream }) => {
         const result = streamText({
@@ -180,6 +183,10 @@ export async function POST(request: Request) {
             isEnabled: isProductionEnvironment,
             functionId: 'stream-text',
           },
+          onFinish: ({ totalUsage: usage }) => {
+            // Capture usage data from streamText for later storage
+            totalUsage = usage;
+          },
         });
 
         result.consumeStream();
@@ -210,7 +217,7 @@ export async function POST(request: Request) {
             createdAt: new Date(),
             attachments: [],
             chatId: id,
-            lastContext: null,
+            lastContext: message.role === 'assistant' ? totalUsage : null,
           })),
         });
       },

Analysis

Usage Data Loss Bug in Chat API

Summary

The chat API is losing usage data (token counts) when messages are persisted to the database, creating a disconnect between what users see during active sessions and what's available after page refreshes.

Technical Details

The Problem

In app/(chat)/api/chat/route.ts, the onFinish callback of createUIMessageStream hardcodes lastContext: null when saving messages to the database (line 213), even though the underlying streamText call exposes the usage data that could be stored there.

onFinish: async ({ messages }) => {
  await saveMessages({
    messages: messages.map((message) => ({
      id: message.id,
      role: message.role,
      parts: message.parts,
      createdAt: new Date(),
      attachments: [],
      chatId: id,
      lastContext: null, // ❌ Hardcoded null loses usage data
    })),
  });
},

Data Flow Analysis

  1. During streaming: The messageMetadata function (lines 189-198) correctly sends usage data to clients via part.totalUsage
  2. During persistence: lastContext is hardcoded to null, losing usage information
  3. On reload: convertToUIMessages maps message.lastContext to metadata.usage, resulting in undefined instead of actual usage data
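
To make this flow concrete, here is a rough sketch of steps 1 and 3; the callback and helper shapes are assumptions based on the AI SDK v5 UI message stream API and this repo's convertToUIMessages helper, not verified against the actual code:

import type { LanguageModelUsage } from 'ai';

// Step 1 (streaming): an assumed messageMetadata callback that returns usage
// as message metadata once the 'finish' stream part arrives.
const messageMetadata = ({
  part,
}: {
  part: { type: string; totalUsage?: LanguageModelUsage };
}) => {
  if (part.type === 'finish') {
    return { usage: part.totalUsage };
  }
};

// Step 3 (reload): an assumed convertToUIMessages-style mapping that reads
// lastContext back into metadata.usage; a stored null surfaces as undefined.
type DBMessage = {
  id: string;
  role: 'user' | 'assistant';
  parts: unknown[];
  createdAt: Date;
  lastContext: LanguageModelUsage | null;
};

function convertToUIMessages(dbMessages: DBMessage[]) {
  return dbMessages.map((message) => ({
    id: message.id,
    role: message.role,
    parts: message.parts,
    metadata: {
      createdAt: message.createdAt.toISOString(),
      usage: message.lastContext ?? undefined,
    },
  }));
}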

Database Schema Evidence

The database schema confirms lastContext is designed to store usage data:

lastContext: jsonb('lastContext').$type<LanguageModelV2Usage | null>()
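
For context, a minimal Drizzle sketch of how that column could sit in the Message table; only the lastContext line is taken from the schema above, while the other columns and the LanguageModelV2Usage import path are assumptions:

import { jsonb, pgTable, timestamp, uuid, varchar } from 'drizzle-orm/pg-core';
import type { LanguageModelV2Usage } from '@ai-sdk/provider';

// Illustrative Message table: lastContext is a nullable jsonb column typed as
// the provider usage object, so token counts can be stored per message.
export const message = pgTable('Message', {
  id: uuid('id').primaryKey().notNull().defaultRandom(),
  chatId: uuid('chatId').notNull(),
  role: varchar('role').notNull(),
  createdAt: timestamp('createdAt').notNull(),
  lastContext: jsonb('lastContext').$type<LanguageModelV2Usage | null>(),
});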

API Documentation Verification

According to the AI SDK documentation, streamText's onFinish callback provides a totalUsage parameter containing token usage information. The current implementation fails to capture this data.
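
As a rough sketch of what that usage value carries, based on my reading of the AI SDK v5 types (the field names are an assumption, not quoted from the docs); typing the captured variable this way instead of any would also keep it aligned with the lastContext column type:

// Assumed shape of the totalUsage value reported by streamText's onFinish;
// counts may be undefined when the provider does not report them.
type UsageSketch = {
  inputTokens: number | undefined;
  outputTokens: number | undefined;
  totalTokens: number | undefined;
  reasoningTokens?: number;
  cachedInputTokens?: number;
};

// Captured in streamText's onFinish and written to lastContext when saving.
let totalUsage: UsageSketch | null = null;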

Impact

  • User Experience: Usage information disappears after page refresh/reload
  • Analytics: Lost token usage data prevents accurate billing and usage tracking
  • Debugging: Difficulty troubleshooting expensive API calls without persistent usage data

Solution Implemented

The fix captures usage data from streamText's onFinish callback and stores it in the database:

  1. Added a variable to capture totalUsage from the streamText result
  2. Used onFinish callback on streamText to capture the usage data
  3. Modified the message persistence to store usage data for assistant messages

// Capture usage data from streamText
onFinish: ({ totalUsage: usage }) => {
  totalUsage = usage;
},

// Store usage data for assistant messages
lastContext: message.role === 'assistant' ? totalUsage : null,

Verification

  • ✅ TypeScript compilation passes
  • ✅ Next.js build succeeds
  • ✅ Code follows existing patterns in the codebase
  • ✅ Database schema supports the fix
  • ✅ API documentation confirms usage data availability

This fix ensures usage data persists across sessions while maintaining backward compatibility with existing null values for user messages.


vercel bot commented Sep 10, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Preview | Comments | Updated (UTC)
ai-chatbot | Ready | Ready | Preview Comment | Sep 10, 2025 4:01pm

@FranciscoMoretti (Author)

@dancer review pls
