1 parent dc6fc3c · commit 9c5366c
src/Native/LibTorchSharp/THSJIT.cpp
@@ -74,7 +74,7 @@ void THSJIT_Module_zero_grad(const JITModule module, bool set_to_none)
     // torch::jit::Module has no zero_grad().
     // As a workaround, manually loop over the parameters and zero them out like optimizer does;
     // https://github.com/pytorch/pytorch/blob/v2.5.1/torch/csrc/api/src/optim/optimizer.cpp#L123
-    for (auto& p : (*module)->parameters()) {
+    for (const auto& p : (*module)->parameters()) {
         if (p.mutable_grad().defined()) {
             p.mutable_grad().detach_();
             if (set_to_none)