Commit 373f6b2

SDK: Add Responses API example to error handling guide
- Show try/except around LLM.responses()

Refs OpenHands/software-agent-sdk#980
Co-authored-by: openhands <[email protected]>

1 parent d490eb5 · commit 373f6b2

File tree

1 file changed: +21 −0 lines changed

sdk/guides/error-handling.mdx

Lines changed: 21 additions & 0 deletions
@@ -68,6 +68,27 @@ except LLMError as e:
 
 The same exceptions are raised from both `LLM.completion()` and `LLM.responses()` paths.
 
+### Example: Using the Responses API
+
+```python icon="python"
+from pydantic import SecretStr
+from openhands.sdk import LLM
+from openhands.sdk.llm import Message, TextContent
+from openhands.sdk.llm.exceptions import LLMError, LLMContextWindowExceedError
+
+llm = LLM(model="claude-sonnet-4-20250514", api_key=SecretStr("your-key"))
+
+try:
+    resp = llm.responses([
+        Message.user([TextContent(text="Write a one-line haiku about code.")])
+    ])
+    print(resp.message)
+except LLMContextWindowExceedError:
+    print("Context window exceeded. Consider enabling a condenser.")
+except LLMError as e:
+    print(f"LLM error: {e}")
+```
+
 ## Using agents and conversations
 
 When you use `Agent` and `Conversation`, LLM exceptions propagate out of `conversation.run()` and `conversation.send_message(...)` if a condenser is not present.
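The final context line of the diff describes exception propagation out of `conversation.run()` when no condenser is present. That control flow can be sketched with local stand-ins rather than the real SDK classes; `FakeConversation` and the exception stubs below are hypothetical placeholders, not part of `openhands.sdk`:

```python
class LLMError(Exception):
    """Stand-in for openhands.sdk.llm.exceptions.LLMError."""

class LLMContextWindowExceedError(LLMError):
    """Stand-in for the context-window-exceeded exception."""

class FakeConversation:
    """Hypothetical stand-in: without a condenser, LLM errors
    raised inside run() propagate to the caller unchanged."""

    def __init__(self, condenser=None):
        self.condenser = condenser

    def run(self):
        if self.condenser is None:
            # No condenser to trim history: the error escapes run().
            raise LLMContextWindowExceedError("context window exceeded")
        return "condensed and retried"

def drive(conversation):
    # Callers wrap run() the same way the guide wraps LLM.responses().
    try:
        return conversation.run()
    except LLMContextWindowExceedError:
        return "handled: context window exceeded"
    except LLMError as e:
        return f"handled: {e}"
```

Because `LLMContextWindowExceedError` subclasses `LLMError`, the more specific `except` clause must come first, mirroring the ordering in the diff's example.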
