python - Constrained Optimization with Scipy for a nonlinear function
I am trying to maximize x^(0.5) * y^(0.5)
subject to x + y = 10
using scipy.
I can't figure out which method to use. I would appreciate it if you could guide me on this.
Here are two possible approaches.

The first version uses fmin_cobyla and therefore does not require the derivative of f:
from scipy.optimize import fmin_cobyla

f = lambda x: -(x[0]**0.5 * x[1]**0.5)

# x + y = 10  <=>  (x + y - 10 >= 0) & (-x - y + 10 >= 0)
c1 = lambda x: x[0] + x[1] - 10
c2 = lambda x: 10 - x[0] - x[1]

fmin_cobyla(f, [0, 0], cons=(c1, c2))
This returns:

array([ 4.9999245,  5.0000755])
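For reference, here is a minimal sketch of the same setup through the newer scipy.optimize.minimize interface with method='COBYLA' (assuming a SciPy version that provides minimize; COBYLA there likewise only handles inequality constraints, hence the same two-inequality trick):

from scipy.optimize import minimize

f = lambda x: -(x[0]**0.5 * x[1]**0.5)

# x + y = 10 expressed as two inequalities, as above
cons = ({'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 10},
        {'type': 'ineq', 'fun': lambda x: 10 - x[0] - x[1]})

res = minimize(f, [1.0, 1.0], method='COBYLA', constraints=cons)
print(res.x)  # should end up close to [5, 5]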
The second version uses fmin_slsqp and exploits the fact that the partial derivatives can be calculated analytically:
from scipy.optimize import fmin_slsqp

f = lambda x: -(x[0]**0.5 * x[1]**0.5)

def f_prime(x):
    # analytic gradient of f (note the minus signs from the negated objective)
    ddx1 = -0.5 * x[0]**-0.5 * x[1]**0.5
    ddx2 = -0.5 * x[1]**-0.5 * x[0]**0.5
    return [ddx1, ddx2]

f_eq = lambda x: x[0] + x[1] - 10

fmin_slsqp(f, [0.01, 0.01], fprime=f_prime, f_eqcons=f_eq)
The output is:
Optimization terminated successfully.    (Exit mode 0)
            Current function value: -5.0
            Iterations: 2
            Function evaluations: 2
            Gradient evaluations: 2
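As a sanity check, the analytic optimum is x = y = 5 with objective value 5: on x + y = 10 the product x*y is largest when the two terms are equal, which matches both runs. If you prefer the newer minimize interface here as well, a minimal sketch with an explicit equality constraint and the analytic gradient might look like this (again assuming a SciPy version that provides minimize):

from scipy.optimize import minimize

f = lambda x: -(x[0]**0.5 * x[1]**0.5)

def f_prime(x):
    # gradient of the negated objective
    return [-0.5 * x[0]**-0.5 * x[1]**0.5,
            -0.5 * x[1]**-0.5 * x[0]**0.5]

eq_con = {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 10}

res = minimize(f, [0.01, 0.01], jac=f_prime, method='SLSQP', constraints=[eq_con])
print(res.x, res.fun)  # expected: roughly [5, 5] and -5.0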