Optimization v5.1.0
Merged pull requests:
- Rewrite the progressbar part of OptimizationOptimisers (#1060) (@prbzrg)
- Add OptimizationMadNLP (#1061) (@SebastianM-C)
- CompatHelper: bump compat for OptimizationNLPModels to 1 for package docs, (keep existing compat) (#1063) (@github-actions[bot])
- CompatHelper: bump compat for Optimization to 5 for package docs, (keep existing compat) (#1064) (@github-actions[bot])
- Better handling of Base method movement (#1066) (@ChrisRackauckas)
- Clean up and update to OptimizationBase@v4 (#1067) (@SebastianM-C)
- Fix NLopt crash with gradient-based algorithms when no AD backend specified (#1068) (@ChrisRackauckas-Claude) (see the AD-backend sketch after this list)
- Bugfix (#1069) (@SebastianM-C)
- Cleanup for OptimizationBase (#1071) (@SebastianM-C)
- Fix deprecated field names in MultistartOptimization result handling (#1072) (@ChrisRackauckas-Claude)
- Fix LBFGS/BFGS callback receiving Dual numbers instead of scalar loss values (#1075) (@ChrisRackauckas-Claude)
- Use the current OptimizationBase instead of the release (#1080) (@SebastianM-C)
- Skip gradient updates when gradients contain NaN or Inf (#1081) (@ChrisRackauckas-Claude)
- Shorten the OptimizationMadNLP tests (#1082) (@SebastianM-C)
- Continue fixing CI (#1083) (@SebastianM-C)
- Fix OptimizationMadNLP version (#1084) (@SebastianM-C)
- Add missing bumps (#1086) (@SebastianM-C)
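
Several entries above concern how AD backends are wired into gradient-based solvers (notably #1068). The following is a minimal sketch, not code taken from those PRs, showing how to attach an AD backend explicitly so a gradient-based NLopt algorithm has a gradient to work with; the Rosenbrock objective and the ForwardDiff backend are illustrative choices.

```julia
using Optimization, OptimizationNLopt, ForwardDiff

# Standard two-parameter Rosenbrock test objective.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Explicitly specify the AD backend so gradient-based algorithms
# (e.g. NLopt.LD_LBFGS) have derivatives available instead of erroring.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)

sol = solve(prob, NLopt.LD_LBFGS())
```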
Closed issues:
- OptimizationBase (v2) fails to precompile (#1056)
- [ERROR] `@logprogress` must be used inside `@withprogress` or with `_id` keyword argument (#1059)
- OptimizationPRIMA v0.3.2 fails to precompile (#1062)
- 1 dependency had output during precompilation (#1065)
- Did Optimization v5 break Optimization v4 somehow? (#1070)
- (L-)BFGS with bounds reports negative loss to callback (#1073) (see the callback sketch below)
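
As a companion to #1073 and #1075, here is a hedged sketch of the expected callback behaviour: the callback should receive a plain scalar loss value rather than a ForwardDiff.Dual. The bounded Rosenbrock problem and the choice of Optimization.LBFGS are illustrative assumptions, not taken from the issue reports.

```julia
using Optimization, ForwardDiff

rosenbrock(u, p) = (1.0 - u[1])^2 + 100.0 * (u[2] - u[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, [0.0, 0.0]; lb = [-1.0, -1.0], ub = [2.0, 2.0])

# The callback receives the current optimization state and the objective value;
# after the fix this value is expected to be a real scalar, not a Dual number.
callback = function (state, loss)
    println("iteration loss = ", loss)
    return false  # returning true would halt the solve early
end

sol = solve(prob, Optimization.LBFGS(); callback)
```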