Add support in prepareStep for changing the model temperature, maxOutputTokens, topP
#8712
          Replies: 1 comment
This discussion was automatically locked because it has not been updated in over 30 days. If you still have questions about this topic, please ask us at community.vercel.com/ai-sdk
I have different values for temperature etc. depending on the model. I would like to return this data from prepareStep so I can change it based on the current model. This is especially important when the model is switched in a step in the middle of a message; otherwise you can't really switch models, because parameters like temperature cannot be changed.
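The per-model settings lookup described above could be sketched as follows. This is a minimal illustration, not AI SDK code: the model ids, the values, and the `settingsFor` helper are all made up for the example, and returning these fields from `prepareStep` is exactly the feature being requested here, not something the SDK currently supports.

```typescript
// Hypothetical per-model sampling settings. Ids and values are
// illustrative only, not real AI SDK model identifiers.
type ModelSettings = {
  temperature: number;
  maxOutputTokens: number;
  topP: number;
};

const settingsByModel: Record<string, ModelSettings> = {
  'model-a': { temperature: 0.2, maxOutputTokens: 1024, topP: 0.9 },
  'model-b': { temperature: 0.7, maxOutputTokens: 4096, topP: 1.0 },
};

// Fall back to neutral defaults when a model has no explicit entry.
function settingsFor(modelId: string): ModelSettings {
  return (
    settingsByModel[modelId] ?? {
      temperature: 0.5,
      maxOutputTokens: 2048,
      topP: 1.0,
    }
  );
}
```

If `prepareStep` accepted these fields in its return value, a step that swaps the model could call `settingsFor(newModelId)` and return the result alongside the new model, keeping sampling parameters consistent with whichever model handles the step.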