Conversation

hipsterusername
Member

Summary

This exposes the tile size and overlap parameters for more controllable upscaling settings. It lets users manually work around AMD OOM issues without changing functionality for anyone else.
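
For context, here's a minimal sketch of the tiling math these parameters control (the function and parameter names are illustrative, not the actual upscaling implementation): peak VRAM scales roughly with the tile area, so shrinking the tile size lets each pass fit on cards that would otherwise OOM, while the overlap is the margin blended between neighbouring tiles to hide seams.

```python
def tile_boxes(width: int, height: int, tile: int = 1024, overlap: int = 128):
    """Yield (left, top, right, bottom) boxes covering an image with overlapping tiles.

    Illustrative only: smaller `tile` values reduce per-pass VRAM at the cost
    of more passes; `overlap` is the blended margin between neighbours.
    """
    stride = tile - overlap
    if stride <= 0:
        raise ValueError("overlap must be smaller than the tile size")
    xs = list(range(0, max(width - tile, 0) + 1, stride))
    ys = list(range(0, max(height - tile, 0) + 1, stride))
    # Make sure the final column/row reaches the image edge.
    if xs[-1] + tile < width:
        xs.append(width - tile)
    if ys[-1] + tile < height:
        ys.append(height - tile)
    for top in ys:
        for left in xs:
            yield (left, top, min(left + tile, width), min(top + tile, height))
```

Exposing these in the UI lets AMD users drop the tile size until a pass fits in VRAM, without changing the defaults for everyone else.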

Related Issues / Discussions

#6981

QA Instructions

Test upscaling SD 1.5 on an AMD card.

Merge Plan

N/A

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)
  • Updated What's New copy (if doing a release after this PR)

@github-actions github-actions bot added the frontend PRs that change frontend files label Jul 12, 2025
Contributor

@heathen711 heathen711 left a comment

Worked for me.

My 2c: Maybe we need to add a ROCm section to the FAQ so these kinds of things can be tracked? I'm sure I'll find other things in the future...

@hipsterusername
Member Author

Worked for me.

My 2c: Maybe we need to add a ROCm section to the FAQ so these kinds of things can be tracked? I'm sure I'll find other things in the future...

Could see that being super helpful. Do you have a list of the ROCm stuff that you’ve navigated through already?

@heathen711
Contributor

Worked for me.
My 2c: Maybe we need to add a ROCm section to the FAQ so these kinds of things can be tracked? I'm sure I'll find other things in the future...

Could see that being super helpful. Do you have a list of the ROCm stuff that you’ve navigated through already?

So far it's been install issues, but I'm addressing those in the ROCm Docker PR. There are two versions in there; the full version will need its own documentation once it's approved, so I'd want to link that there and/or from the Docker guide.
#8152

I also found that https://invoke-ai.github.io/InvokeAI/features/low-vram/#pytorch-cuda-allocator-config does not work for ROCm (for obvious reasons), but that isn't obvious without looking at the logs (the UI just says it failed), so maybe a note on that for ROCm users.
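
As an illustrative sketch (not InvokeAI's actual code), a ROCm build can be detected via `torch.version.hip`, which would let the allocator setting be skipped with an explicit warning instead of a silent failure:

```python
import logging
import os

import torch

logger = logging.getLogger(__name__)


def apply_cuda_alloc_conf(conf: str = "backend:cudaMallocAsync") -> None:
    """Apply PYTORCH_CUDA_ALLOC_CONF only on builds that support it."""
    if torch.version.hip is not None:
        # ROCm build: the CUDA allocator options don't apply, so log it
        # clearly instead of letting the run fail with a cryptic error.
        logger.warning("ROCm detected; ignoring pytorch_cuda_alloc_conf=%r", conf)
        return
    # Must be set before the first CUDA allocation to take effect.
    os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", conf)
```

The `backend:cudaMallocAsync` default here is just an example allocator setting; the point is only the `torch.version.hip` check and the explicit log line.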

These all seem like they have homes on other pages, but I was thinking this is minor and should just be a FAQ. Writing this makes me feel like the guides need to branch: CPU vs. CUDA vs. ROCm. That's out of scope for this PR, though, so just the simple FAQ to start, and polish as time goes on.

@hipsterusername
Member Author

Worked for me.
My 2c: Maybe we need to add a ROCm section to the FAQ so these kinds of things can be tracked? I'm sure I'll find other things in the future...

Could see that being super helpful. Do you have a list of the ROCm stuff that you’ve navigated through already?

So far it's been install issues, but I'm addressing those in the ROCm Docker PR. There are two versions in there; the full version will need its own documentation once it's approved, so I'd want to link that there and/or from the Docker guide. #8152

I also found that https://invoke-ai.github.io/InvokeAI/features/low-vram/#pytorch-cuda-allocator-config does not work for ROCm (for obvious reasons), but that isn't obvious without looking at the logs (the UI just says it failed), so maybe a note on that for ROCm users.

These all seem like they have homes on other pages, but I was thinking this is minor and should just be a FAQ. Writing this makes me feel like the guides need to branch: CPU vs. CUDA vs. ROCm. That's out of scope for this PR, though, so just the simple FAQ to start, and polish as time goes on.

Aye, this will also probably change a bit as ROCm goes live on Windows and gains deeper torch support.

Open to you tackling some of the ROCm stuff you’d recommend - else we’ll just be summarizing what you’ve found already hehe

@hipsterusername hipsterusername force-pushed the expose-tile-size branch 2 times, most recently from f728923 to 57c1e05 on July 13, 2025 00:11
Collaborator

@psychedelicious psychedelicious left a comment

Looks good, just 2 minor things.

@psychedelicious psychedelicious enabled auto-merge (rebase) July 17, 2025 04:19
@psychedelicious psychedelicious merged commit 79f65e5 into main Jul 17, 2025
12 checks passed
@psychedelicious psychedelicious deleted the expose-tile-size branch July 17, 2025 04:21