anasashb

This pull request fixes a bug / typo in layers/Embed.py on lines 13 and 50.

Torch tensors do not have an attribute called require_grad, so the pe.require_grad = False and w.require_grad = False calls simply attached a new, inert require_grad attribute to the instantiated tensors and did nothing.

The correct attribute for tracking gradients in Torch is requires_grad, and in this PR both lines (13 and 50) are changed to reflect this.

The existing bug / typo is not expected to have affected any code behavior: a tensor created with torch.zeros comes with requires_grad set to False by default anyway. But for the sake of clarity, and to prevent future confusion, it is worth fixing.
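A minimal sketch (assuming PyTorch is installed) illustrating why the typo was harmless here but misleading: assigning to the mistyped name just creates a plain Python attribute on the tensor object, while requires_grad is the flag autograd actually reads.

```python
import torch

# Mistyped name: this only attaches a new Python attribute to the
# tensor object; autograd state is untouched.
pe = torch.zeros(10, 4)
pe.require_grad = False

# torch.zeros defaults to requires_grad=False, which is why the typo
# never changed behavior in practice.
assert pe.requires_grad is False
# The typo'd attribute exists separately, doing nothing.
assert getattr(pe, "require_grad") is False

# Correct name: requires_grad is the real flag, and flipping it
# actually enables gradient tracking.
w = torch.zeros(10, 4)
w.requires_grad = True
assert w.requires_grad is True
```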
