NonlinearConstraints in SciPy Optimize
I'm experimenting with the optimization module in SciPy, just writing short trial programs. When the constraints are linear I can obtain a solution, but the Hessian definition doesn't work. I used the example on this site, but I get an error when I try it without the built-in Rosenbrock function and its Hessian.
I also tried a simple problem I found online; my code is:
import numpy as np
from scipy import optimize
from scipy.optimize import NonlinearConstraint

def fun(x):
    return x[0]**2 + x[1]**2 - 8*x[1] + 16

bounds = optimize.Bounds([0, 0, 0], [np.inf, np.inf, np.inf])

def cons_f(x):
    return x[0]**2 + x[1]**2 + x[2]

def cons_J(x):
    return [2*x[0], 2*x[1], 1]

def cons_H(x, v):
    return v[0]*[2, 2, 0]

nonlinear_constraint = optimize.NonlinearConstraint(cons_f, -np.inf, 6, jac=cons_J, hess=cons_H)

x0 = [1, 1]
res = optimize.minimize(fun, x0, method='trust-constr', jac=cons_J, hess=cons_H,
                        constraints=[nonlinear_constraint],
                        options={'verbose': 1}, bounds=bounds)
print(res.x)
In both cases I get the following error:
Traceback (most recent call last):
  File "C:\Users\user\OneDrive - EOP\Escritorio\Test.py", line 19, in <module>
    res = optimize.minimize(fun, x0, method='trust-constr', jac=cons_J, hess=cons_H,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_minimize.py", line 634, in minimize
    return _minimize_trustregion_constr(fun, x0, args, jac, hess, hessp,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 332, in _minimize_trustregion_constr
    objective = ScalarFunction(fun, x0, args, grad, hess,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_differentiable_functions.py", line 163, in __init__
    self.H = hess(np.copy(x0), *args)
TypeError: cons_H() missing 1 required positional argument: 'v'
There are several issues here:

- By setting jac=cons_J and hess=cons_H in the call to minimize, you use the derivatives of the constraint function as the objective's derivatives, which is probably not what you want.
- The constraint Hessian cons_H is wrong.
- Your constraint function is a function of three variables, but your initial guess x0 makes minimize think you have an optimization problem in two variables.
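For context on the second point: the hess callback of NonlinearConstraint has a different signature from the objective's Hessian. It receives the point x plus an array v of Lagrange multipliers (one per constraint component) and must return the weighted sum of the component Hessians, sum_i v[i] * H_i(x). A minimal sketch of the expected shape (not part of the original answer):

```python
import numpy as np

def cons_f(x):
    # single scalar constraint: x0^2 + x1^2 + x2
    return x[0]**2 + x[1]**2 + x[2]

def cons_H(x, v):
    # Hessian of cons_f is constant: diag(2, 2, 0); the linear term x2
    # contributes nothing to the second derivatives. v has one entry
    # because the constraint has one component.
    H = np.array([[2, 0, 0], [0, 2, 0], [0, 0, 0]])
    return v[0] * H

# calling it by hand with a dummy multiplier shows the result is a 3x3 matrix
print(cons_H(np.array([1.0, 1.0, 1.0]), np.array([0.5])))
```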
After fixing all these issues, your code could look like this:
import numpy as np
from scipy.optimize import Bounds, minimize, NonlinearConstraint

# objective and derivatives
def fun(x):
    return x[0]**2 + x[1]**2 - 8*x[1] + 16

def grad(x):
    return np.array([2*x[0], 2*x[1] - 8, 0])

def hess(x):
    return np.array([[2, 0, 0], [0, 2, 0], [0, 0, 0]])

# constraint function and derivatives
def cons_f(x):
    return x[0]**2 + x[1]**2 + x[2]

def cons_J(x):
    return [2*x[0], 2*x[1], 1]

def cons_H(x, v):
    return v[0]*np.array([[2, 0, 0], [0, 2, 0], [0, 0, 0]])

# variable bounds
bounds = Bounds([0, 0, 0], [np.inf, np.inf, np.inf])

# constraint
con = NonlinearConstraint(cons_f, -np.inf, 6, jac=cons_J, hess=cons_H)

# initial guess
x0 = [1, 1, 1]

res = minimize(fun, x0, method='trust-constr', jac=grad, hess=hess,
               constraints=[con], bounds=bounds)
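As a quick sanity check before handing hand-coded derivatives to the solver (a sketch, not part of the original answer), scipy.optimize.check_grad can compare an analytical gradient against a finite-difference approximation of the objective:

```python
import numpy as np
from scipy.optimize import check_grad

# objective and its hand-coded gradient, as in the answer above
def fun(x):
    return x[0]**2 + x[1]**2 - 8*x[1] + 16

def grad(x):
    return np.array([2*x[0], 2*x[1] - 8, 0])

# check_grad returns the 2-norm of the difference between grad and a
# finite-difference approximation of fun's gradient at the given point;
# a correct gradient gives a very small value
err = check_grad(fun, grad, np.array([1.0, 1.0, 1.0]))
print(err)
```

The same idea applies to cons_J: if the returned error is not close to zero, the derivative is miscoded, which tends to fail much more visibly inside trust-constr.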