
Commit 75e83b3

Merge pull request #615 from SciML/zygotedocs
Fix zygote constraint bug and update rosenbrock doc
2 parents e8a682e + 69df7d1 commit 75e83b3

File tree: 4 files changed (+47, -15 lines)


docs/src/examples/rosenbrock.md

Lines changed: 41 additions & 12 deletions
````diff
@@ -22,37 +22,46 @@ _p = [1.0, 100.0]
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 l1 = rosenbrock(x0, _p)
 prob = OptimizationProblem(f, x0, _p)
+```

 ## Optim.jl Solvers

-using OptimizationOptimJL
-
-# Start with some derivative-free optimizers
+### Start with some derivative-free optimizers

+```@example rosenbrock
+using OptimizationOptimJL
 sol = solve(prob, SimulatedAnnealing())
 prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
 sol = solve(prob, SAMIN())

 l1 = rosenbrock(x0, _p)
 prob = OptimizationProblem(rosenbrock, x0, _p)
 sol = solve(prob, NelderMead())
+```

-# Now a gradient-based optimizer with forward-mode automatic differentiation
+### Now a gradient-based optimizer with forward-mode automatic differentiation

+```@example rosenbrock
 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = OptimizationProblem(optf, x0, _p)
 sol = solve(prob, BFGS())
+```

-# Now a second order optimizer using Hessians generated by forward-mode automatic differentiation
+### Now a second order optimizer using Hessians generated by forward-mode automatic differentiation

+```@example rosenbrock
 sol = solve(prob, Newton())
+```

-# Now a second order Hessian-free optimizer
+### Now a second order Hessian-free optimizer

+```@example rosenbrock
 sol = solve(prob, Optim.KrylovTrustRegion())
+```

-# Now derivative-based optimizers with various constraints
+### Now derivative-based optimizers with various constraints

+```@example rosenbrock
 cons = (res, x, p) -> res .= [x[1]^2 + x[2]^2]
 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
@@ -68,24 +77,34 @@ sol = solve(prob, IPNewton())
 prob = OptimizationProblem(optf, x0, _p, lcons = [0.5], ucons = [0.5],
     lb = [-500.0, -500.0], ub = [50.0, 50.0])
-sol = solve(prob, IPNewton()) # Notice now that x[1]^2 + x[2]^2 ≈ 0.5:
-# cons(sol.u, _p) = 0.49999999999999994
+sol = solve(prob, IPNewton())
+
+# Notice now that x[1]^2 + x[2]^2 ≈ 0.5:
+res = zeros(1)
+cons(res, sol.u, _p)
+println(res)
+```

+```@example rosenbrock
 function con_c(res, x, p)
     res .= [x[1]^2 + x[2]^2]
 end

 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = con_c)
 prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf], ucons = [0.25^2])
 sol = solve(prob, IPNewton()) # -Inf < cons_circ(sol.u, _p) = 0.25^2
+```

 ## Evolutionary.jl Solvers

+```@example rosenbrock
 using OptimizationEvolutionary
 sol = solve(prob, CMAES(μ = 40, λ = 100), abstol = 1e-15) # -Inf < cons_circ(sol.u, _p) = 0.25^2
+```

 ## IPOPT through OptimizationMOI

+```@example rosenbrock
 using OptimizationMOI, Ipopt

 function con2_c(res, x, p)
@@ -95,38 +114,48 @@ end
 optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote(); cons = con2_c)
 prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf, -Inf], ucons = [Inf, Inf])
 sol = solve(prob, Ipopt.Optimizer())
+```

-# Now let's switch over to OptimizationOptimisers with reverse-mode AD
+## Now let's switch over to OptimizationOptimisers with reverse-mode AD

+```@example rosenbrock
 using OptimizationOptimisers
 optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
 prob = OptimizationProblem(optf, x0, _p)
 sol = solve(prob, Adam(0.05), maxiters = 1000, progress = false)
+```

 ## Try out CMAEvolutionStrategy.jl's evolutionary methods

+```@example rosenbrock
 using OptimizationCMAEvolutionStrategy
 sol = solve(prob, CMAEvolutionStrategyOpt())
+```

 ## Now try a few NLopt.jl solvers with symbolic differentiation via ModelingToolkit.jl

+```@example rosenbrock
 using OptimizationNLopt, ModelingToolkit
 optf = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit())
 prob = OptimizationProblem(optf, x0, _p)

 sol = solve(prob, Opt(:LN_BOBYQA, 2))
 sol = solve(prob, Opt(:LD_LBFGS, 2))
+```

-## Add some box constraints and solve with a few NLopt.jl methods
+### Add some box constraints and solve with a few NLopt.jl methods

+```@example rosenbrock
 prob = OptimizationProblem(optf, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
 sol = solve(prob, Opt(:LD_LBFGS, 2))
 sol = solve(prob, Opt(:G_MLSL_LDS, 2), local_method = Opt(:LD_LBFGS, 2), maxiters = 10000) #a global optimizer with random starts of local optimization
+```

 ## BlackBoxOptim.jl Solvers

+```@example rosenbrock
 using OptimizationBBO
-prob = Optimization.OptimizationProblem(rosenbrock, x0, _p, lb = [-1.0, 0.2],
+prob = Optimization.OptimizationProblem(rosenbrock, [0.0, 0.3], _p, lb = [-1.0, 0.2],
     ub = [0.8, 0.43])
 sol = solve(prob, BBO_adaptive_de_rand_1_bin()) # -1.0 ≤ x[1] ≤ 0.8, 0.2 ≤ x[2] ≤ 0.43
 ```
````
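The doc change above replaces a stale inline comment with a runnable check of the constraint residual. A minimal sketch of that in-place constraint convention, using the same `rosenbrock` and `cons` definitions as the docs; the evaluation point `[0.5, 0.5]` is an arbitrary stand-in for `sol.u`:

```julia
# In-place constraint convention from the updated docs: the constraint writes
# its residual into a preallocated `res` instead of returning a value.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
cons = (res, x, p) -> res .= [x[1]^2 + x[2]^2]

_p = [1.0, 100.0]
u = [0.5, 0.5]     # stand-in for sol.u; the real solve lands near x₁² + x₂² ≈ 0.5

res = zeros(1)     # preallocate the residual vector
cons(res, u, _p)   # mutates res in place
println(res)       # [0.5]
```

This is why the new doc code needs the explicit `res = zeros(1)` before calling `cons`: the old comment pretended `cons(sol.u, _p)` returned a value, which the in-place signature never did.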

docs/src/optimization_packages/prima.md

Lines changed: 2 additions & 0 deletions
````diff
@@ -26,6 +26,8 @@ The five Powell's algorithms of the prima library are provided by the PRIMA.jl p
 `COBYLA`: (Constrained Optimization BY Linear Approximations) is for general constrained problems with bound constraints, non-linear constraints, linear equality constraints, and linear inequality constraints.

 ```@example PRIMA
+using Optimization, OptimizationPRIMA
+
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 _p = [1.0, 100.0]
````
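The missing `using` line matters because Documenter evaluates each page's `@example` blocks in its own sandbox module, so the PRIMA page cannot rely on another doc page having loaded the packages. A base-Julia sketch of that isolation, where `PageA`/`PageB` are hypothetical names standing in for two doc pages and the stdlib `LinearAlgebra` stands in for `OptimizationPRIMA`:

```julia
module PageA
using LinearAlgebra           # this "page" loads the package itself...
norm_ok = norm([3.0, 4.0])    # ...so `norm` resolves here
end

module PageB
# no `using LinearAlgebra` in this sandbox, so `norm` is not visible,
# regardless of what PageA loaded
has_norm = isdefined(@__MODULE__, :norm)
end

println(PageA.norm_ok)    # 5.0
println(PageB.has_norm)   # false
```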

ext/OptimizationZygoteExt.jl

Lines changed: 2 additions & 2 deletions
````diff
@@ -124,12 +124,12 @@ function Optimization.instantiate_function(f, cache::Optimization.ReInitCache,
         cons = nothing
     else
         cons = (res, θ) -> f.cons(res, θ, cache.p)
-        cons_oop = (x) -> (_res = zeros(eltype(x), num_cons); cons(_res, x); _res)
+        cons_oop = (x) -> (_res = Zygote.Buffer(x, num_cons); cons(_res, x); copy(_res))
     end

     if cons !== nothing && f.cons_j === nothing
         cons_j = function (J, θ)
-            J .= Zygote.jacobian(cons_oop, θ)
+            J .= first(Zygote.jacobian(cons_oop, θ))
         end
     else
         cons_j = (J, θ) -> f.cons_j(J, θ, cache.p)
````
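The two-line fix above works around two Zygote behaviors: mutating a plain array is not differentiable (hence `Zygote.Buffer` plus `copy` in `cons_oop`), and `Zygote.jacobian` returns a tuple holding one Jacobian per differentiated argument (hence `first`). A base-Julia sketch of the second point, where `fake_jacobian` is a hypothetical stand-in mimicking only the tuple return convention:

```julia
# Hypothetical stand-in for Zygote.jacobian's return type: a 1-tuple holding
# the Jacobian of θ -> [θ₁² + θ₂²], i.e. the 1×2 matrix [2θ₁ 2θ₂].
fake_jacobian(f, θ) = ([2θ[1] 2θ[2]],)

J = zeros(1, 2)
# `J .= fake_jacobian(...)` would try to broadcast a tuple of matrices into a
# Float64 matrix and fail; unwrap the tuple first, as the fixed cons_j does.
J .= first(fake_jacobian(nothing, [3.0, 4.0]))
println(J)   # [6.0 8.0]
```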

lib/OptimizationPRIMA/src/OptimizationPRIMA.jl

Lines changed: 2 additions & 1 deletion
````diff
@@ -16,7 +16,8 @@ SciMLBase.allowsconstraints(::Union{LINCOA, COBYLA}) = true
 SciMLBase.allowsbounds(opt::Union{BOBYQA, LINCOA, COBYLA}) = true
 SciMLBase.requiresconstraints(opt::COBYLA) = true

-function Optimization.OptimizationCache(prob::SciMLBase.OptimizationProblem, opt::PRIMASolvers, data;
+function Optimization.OptimizationCache(prob::SciMLBase.OptimizationProblem,
+        opt::PRIMASolvers, data;
         callback = Optimization.DEFAULT_CALLBACK,
         maxiters::Union{Number, Nothing} = nothing,
         maxtime::Union{Number, Nothing} = nothing,
````
