
Conversation

nghielme
Contributor

I extended the channels-last converter to work with models that contain branches.
I tested it with a dummy model composed of a single fork node and with a complete UNet model containing multiple fork nodes.

…e problem now seems to be moving the transpose upstream of the fork and properly reconnecting the network. I am also not sure what to do about the model's tensor shape, since I see that it is modified in the base case.
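The fork-node case described above can be illustrated with a minimal sketch: when every branch after a fork starts with the same Transpose, the per-branch Transposes can be replaced by a single one placed before the fork. Plain dicts stand in for ONNX nodes here; all names and the helper itself are illustrative, not the PR's actual implementation.

```python
# Sketch: hoist identical Transpose consumers of a fork node above the fork.
# Nodes are plain dicts ({"op", "inputs", "output", optional "perm"}); this is
# an illustration of the idea, not the converter's real code.

def hoist_transpose_above_fork(graph):
    """If every consumer of a node's output is a Transpose with the same
    perm, replace them with a single Transpose placed before that node."""
    for node in list(graph):
        consumers = [n for n in graph if node["output"] in n["inputs"]]
        if len(consumers) < 2 or not all(
            c["op"] == "Transpose" and c["perm"] == consumers[0]["perm"]
            for c in consumers
        ):
            continue
        # One Transpose on the fork node's input replaces the per-branch copies.
        hoisted = {"op": "Transpose", "perm": consumers[0]["perm"],
                   "inputs": node["inputs"][:], "output": node["output"] + "_t"}
        node["inputs"] = [hoisted["output"]]
        for c in consumers:  # rewire each branch to consume the fork output
            for succ in graph:
                succ["inputs"] = [node["output"] if i == c["output"] else i
                                  for i in succ["inputs"]]
        graph[:] = [n for n in graph if n not in consumers]
        graph.insert(graph.index(node), hoisted)
    return graph

# Fork: a Relu feeds two branches, each starting with the same NHWC->NCHW perm.
demo = hoist_transpose_above_fork([
    {"op": "Relu", "inputs": ["x"], "output": "r"},
    {"op": "Transpose", "perm": (0, 3, 1, 2), "inputs": ["r"], "output": "t1"},
    {"op": "Transpose", "perm": (0, 3, 1, 2), "inputs": ["r"], "output": "t2"},
    {"op": "Conv", "inputs": ["t1"], "output": "c1"},
    {"op": "Conv", "inputs": ["t2"], "output": "c2"},
])
```

After the pass, a single Transpose feeds the Relu, and both Conv branches read the Relu output directly.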
@nghielme nghielme requested review from maltanar and jmitrevs April 19, 2024 09:29
_channelsLast_node_types = list(channels_last.custom_op.keys())

 # Nodes which do not modify the shape of the tensor
 # and which modify all values in the same way.
-_move_through_nodes = ["Quant", "Relu"]
+_move_through_nodes = ["Quant", "Relu", "LeakyRelu", "Resize"]
@nghielme nghielme Apr 23, 2024
I am not sure that Resize should be in the _move_through_nodes list, since it actually modifies the shape of the tensor.
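The concern can be checked numerically: an op qualifies for _move_through_nodes only if it commutes with Transpose, i.e. f(x.transpose(perm)) == f(x).transpose(perm) for any perm. A small NumPy sketch (illustrative, not the converter's code; the "resize" here is a naive nearest-neighbour 2x upsampling standing in for Resize) shows that Relu commutes while a shape-changing resize does not, because its upsampled axes move under the permutation:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 3, 4, 5))
perm = (0, 2, 3, 1)  # NCHW -> NHWC

# Element-wise op: transposing before or after gives the same result.
relu = lambda a: np.maximum(a, 0.0)
relu_commutes = np.array_equal(relu(x.transpose(perm)),
                               relu(x).transpose(perm))

# Shape-changing op: naive 2x nearest-neighbour upsampling on axes 2 and 3
# (a stand-in for Resize). The upsampled axes are fixed, so they end up in
# different positions depending on whether the Transpose runs first.
resize2x = lambda a: a.repeat(2, axis=2).repeat(2, axis=3)
resize_commutes = (resize2x(x.transpose(perm)).shape
                   == resize2x(x).transpose(perm).shape)
```

So letting a Transpose pass through Resize would also require permuting the Resize scales/sizes inputs, unlike the purely element-wise ops already in the list.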

@heborras heborras self-requested a review May 29, 2024 13:41
nghielme added 4 commits May 29, 2024 16:41
…e in which the transposes pass the special nodes. I added a cleaning transformation for the domain field; it is not very elegant, but I think it is strictly necessary.
removal of any remaining input and output transposes
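The removal of leftover input/output transposes boils down to a cancellation check: two back-to-back Transposes are redundant exactly when their perms compose to the identity (e.g. NCHW-to-NHWC followed by NHWC-to-NCHW). A small sketch of that check, with illustrative function names rather than the PR's actual code:

```python
# Sketch of the cancellation rule for removing redundant Transpose pairs.
# Transpose(perm): output axis i takes input axis perm[i].

def compose(perm1, perm2):
    """Effective perm of Transpose(perm1) followed by Transpose(perm2)."""
    return tuple(perm1[p] for p in perm2)

def cancels(perm1, perm2):
    """True if the two Transposes compose to the identity and can be removed."""
    return compose(perm1, perm2) == tuple(range(len(perm1)))

nchw_to_nhwc = (0, 2, 3, 1)
nhwc_to_nchw = (0, 3, 1, 2)
```

These two perms are inverses of each other, so either ordering of the pair cancels, while repeating the same perm twice does not.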