res = DiffEqFlux.sciml_train(loss, pinit, ADAM(), maxiters = 1000)

# res = DiffEqFlux.sciml_train(loss, pinit, BFGS(), maxiters = 1000) ### errors!

# Try the Newton method of optimization with forward-mode AD instead
res = DiffEqFlux.sciml_train(loss, pinit, Newton(), GalacticOptim.AutoForwardDiff())
```
You might notice that `AutoZygote` (the default) fails for the above `sciml_train` call with Optim's optimizers. This happens because Zygote returns `nothing` for zero gradients, which the optimizers cannot handle. To avoid this issue, you can use a different version of the same check that compares the size of the obtained solution against the size of the data, shown below, which is easier to differentiate through.
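Here is a minimal sketch of such a size-based check inside the loss function. The names `prob`, `dataset`, and the `Tsit5` solver are assumptions standing in for the ODE problem, training data, and solver defined earlier in the tutorial:

```julia
function loss(p)
    # Re-solve the ODE with the candidate parameters
    # (assumed names: prob is the ODEProblem, dataset is the training data)
    tmp_prob = remake(prob, p = p)
    tmp_sol = solve(tmp_prob, Tsit5(), saveat = 0.1)
    # If the solver diverged and exited early, the solution is shorter than
    # the data; comparing sizes avoids a branch Zygote cannot differentiate
    if size(tmp_sol) == size(dataset)
        return sum(abs2, Array(tmp_sol) .- dataset)
    else
        return Inf
    end
end
```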