
Conversation

kompfner (Contributor)

No description provided.


codecov bot commented Sep 26, 2025

Codecov Report

❌ Patch coverage is 3.49650% with 138 lines in your changes missing coverage. Please review.

Files with missing lines | Patch % | Lines
src/pipecat/services/aws_nova_sonic/aws.py | 0.00% | 78 Missing ⚠️
...ipecat/adapters/services/aws_nova_sonic_adapter.py | 0.00% | 58 Missing ⚠️
src/pipecat/processors/aggregators/llm_context.py | 71.42% | 2 Missing ⚠️

Files with missing lines | Coverage Δ
src/pipecat/processors/aggregators/llm_context.py | 52.94% <71.42%> (+0.85%) ⬆️
...ipecat/adapters/services/aws_nova_sonic_adapter.py | 0.00% <0.00%> (ø)
src/pipecat/services/aws_nova_sonic/aws.py | 0.00% <0.00%> (ø)

@kompfner kompfner force-pushed the pk/aws-nova-sonic-universal-llmcontext-1 branch from ec0baa6 to 4d9873b Compare September 29, 2025 14:28
@kompfner kompfner marked this pull request as ready for review September 29, 2025 14:30
        )
        return filtered_messages

    def get_messages_for_persistent_storage(

Contributor Author

This is new functionality on `LLMContext`. There are corresponding unit tests in this PR.

The reason for the `system_instruction` argument is to support usage with LLMs where you might pass the system instruction as a parameter to the `LLMService` rather than via the context.

Pending: I'll update some examples to highlight this functionality.

Contributor Author

Hmm...as I think about this more, I'm not sure we need the complexity of the `system_instruction` argument...

- If you specified your system instruction in the context in the first place, it'll still be there when you read messages for persistent storage
- If you didn't specify your system instruction in the context and instead passed it in as an `LLMService` parameter, you probably don't want it to be in the context when you read messages for persistent storage
- ...and if you really really do need to inject it at the start of the context, it's quite easy to do yourself

Contributor Author

Which means...we probably don't need this method at all. We can just rely on `get_messages()`.

The reason for its `system_instruction` argument was to support usage with LLMs where you might pass the system instruction as a parameter to the `LLMService` rather than specifying it in the context.

But as I thought about it more I became unconvinced that the `system_instruction` argument was really beneficial:

- If you specified your system instruction in your context in the first place, it'll still be there when you read messages for persistent storage
- If you didn't specify your system instruction in the context and instead passed it in as an `LLMService` parameter, you most likely *don't* want it to be in the context when you read messages for persistent storage
- ...and if you really really do need to inject it at the start of the context, it's quite easy to do anyway

And if we remove the `system_instruction` argument from `get_messages_for_persistent_storage()`, then it's essentially just `get_messages()`.
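
The "it's quite easy to do yourself" point above can be sketched as follows. This is a hypothetical helper, not part of pipecat's API; it assumes messages are plain role/content dicts of the kind `get_messages()` would return:

```python
def messages_with_system_instruction(messages, system_instruction):
    """Prepend a system message when the instruction was kept outside the
    context (e.g. passed to the LLM service as a parameter instead).

    Hypothetical helper for illustration only.
    """
    if system_instruction is None:
        return list(messages)
    return [{"role": "system", "content": system_instruction}, *messages]


# Example: a context that never contained the system instruction
stored = messages_with_system_instruction(
    [{"role": "user", "content": "Hello"}],
    "You are a helpful assistant.",
)
# stored now starts with the injected system message
```

In other words, callers who genuinely need the instruction injected can do it in a couple of lines at the storage boundary, without the context API carrying an extra argument.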
tools: Available tools/functions for the model to use.
send_transcription_frames: Whether to emit transcription frames.
.. deprecated:: 0.0.87

Contributor Author

Note to self: update this if this doesn't make the 0.0.87 release
