Recover from messed up function call #511
@caramboleyo It's impossible for models to call functions with parameters that don't fully conform to the parameter schema you specify; the model technically cannot generate tokens that don't conform to it, since schema enforcement is an integral part of the inference engine (via a grammar). If you want to inform the model about an error that occurred in the function handler, make sure you return a string with an indicative error describing what happened, so the model can understand it and act upon it. I recommend reading the Using Function Calling and Using a JSON Schema Grammar guides, as they cover everything you need to know to use function calling and JSON schemas in node-llama-cpp. If you still encounter issues with function calling, I'd love to help!
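As a minimal sketch of that advice (the function name, parameters, and data here are illustrative, not part of node-llama-cpp's API): the handler validates its input and returns a descriptive error string instead of throwing, so the model can read the error and adjust its next call.

```typescript
// Hypothetical handler for a "getWeather" function exposed to the model.
// On bad input or a failed lookup it returns an indicative error string
// (which the model sees as the function's result) rather than throwing.
function getWeatherHandler(params: { city: string }): string {
    const city = params.city.trim();

    if (city === "")
        return 'Error: the "city" parameter must be a non-empty string';

    // Stand-in for a real data source.
    const weatherByCity: Record<string, string> = {
        London: "12°C, light rain"
    };

    if (!(city in weatherByCity))
        return `Error: no weather data found for city "${city}"`;

    return weatherByCity[city];
}
```

Returning the error as a plain string keeps the chat loop running, and in practice the model will often correct its arguments on the next attempt.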
All models I try mess up function calls frequently. Right now the app crashes, for example, when the argument parsing fails or when the LLM tries to send null or an empty response. For the model to know it did something wrong, a meaningful error response would be nice. Right now the whole app crashes on this.
If I catch the whole prompt, I could fake the malformed request into the chat history and rerun, but for that I need the exact response from the LLM that caused the crash, and there is no way to get that, since none of the on-handlers trigger before the crash.
Also, if I fake it into the chat session, how do I make the LLM rerun it? Maybe put the failed attempt in as a system message?
I was wondering how others handle this problem. node-llama-cpp does not seem to offer a way to properly handle (or recover from) errors; it just crashes on every little thing, so I am guessing I am missing something crucial in the handling?
I would put a try/catch block around the whole function-calling code at line 1357 of LlamaChat.js and, on catch, call an onFunctionError callback containing the error and all the tokens that were attempted to be processed as a function call. But I guess that's too deep in the code base, because we would have to return the whole big return object from line 1683?
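Short of patching the library internals, one workaround is to apply the try/catch at the handler level instead: a small wrapper that converts any thrown error into an error string returned to the model. This is a sketch under stated assumptions (the names are hypothetical, and real handlers may be async, in which case the wrapper would await and catch rejections too):

```typescript
// A function handler as seen by the chat session: takes parsed params,
// returns the string result the model will read.
type FunctionHandler = (params: unknown) => string;

// Wrap a handler so that thrown errors become an error string returned
// to the model, instead of propagating up and crashing the app.
function withErrorAsResult(handler: FunctionHandler): FunctionHandler {
    return (params) => {
        try {
            return handler(params);
        } catch (err) {
            const message = err instanceof Error ? err.message : String(err);
            return `Error while running the function: ${message}. ` +
                "Please check the arguments and call the function again.";
        }
    };
}
```

This doesn't cover malformed argument parsing inside the library itself, but it keeps handler-side failures from taking down the app and gives the model something actionable to retry with.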