Gradient-Based Optimizations in Python

I am trying to solve a couple of minimization problems using Python, but I am having trouble with the setup when constraints are involved. I have:

Minimize: x + y + 2z^2, subject to: x = 1 and x^2 + y^2 = 1

This one is obviously very simple, and I know the solution is x = 1, y = 0, z = 0. I tried to use scipy.optimize.minimize with method='L-BFGS-B' but ran into problems.
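(For context: L-BFGS-B only handles simple box bounds, not general equality constraints, which is presumably where the trouble comes from. Below is a minimal sketch, not code from the question, of how this first problem could be set up with a constraint-aware method such as SLSQP.)

from scipy.optimize import minimize

def objective(v):
    x, y, z = v
    return x + y + 2*z**2

# Each equality constraint is written as a function that should equal zero
cons = [{'type': 'eq', 'fun': lambda v: v[0] - 1},               # x = 1
        {'type': 'eq', 'fun': lambda v: v[0]**2 + v[1]**2 - 1}]  # x^2 + y^2 = 1

sol = minimize(objective, x0=[0.9, 0.3, 0.5], method='SLSQP', constraints=cons)
print(sol.x)  # should approach [1, 0, 0]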

I also have: Minimize: 2x1^2 + x2^2, subject to: x1 + x2 = 1

I need to use a gradient-based optimizer, so I picked COBYLA from scipy.optimize, but I am running into problems with the equality constraint, because COBYLA only takes inequality constraints. The code is:

from scipy.optimize import minimize

def objective(x):
    x1 = x[0]
    x2 = x[1]
    return 2*(x1**2) + x2

def constraint1(x):
    return x[0] + x[1] - 1

# Try an initial condition of x1 = 0.3 and x2 = 0.7
# Our initial condition satisfies the constraint already
x0 = [0.3, 0.7]
print(objective(x0))
xnew = [0.25, 0.75]
print(objective(xnew))

# Since we have already calculated on paper, we know that x1 and x2
# fall between 0 and 1, so we can set both variables' bounds to (0, 1)
b = (0, 1)
bounds = (b, b)
# Note the type of constraint we have for our optimizer
con1 = {'type': 'eq', 'fun': constraint1}
cons = [con1]
sol_gradient = minimize(objective, x0, method='COBYLA', bounds=bounds, constraints=cons)

I then get an error about using an equality constraint with this optimizer.

A few things:

  1. Your objective function does not match the description you provided. Should it be: 2*(x1**2) + x2**2?
  2. From the documentation for scipy.optimize.minimize you can see that COBYLA does not support eq constraints (a possible workaround is sketched after this list). From the page:

Note that COBYLA only supports inequality constraints.

  3. Since you say you want to use a gradient-based optimizer, one option is the Sequential Least Squares Programming (SLSQP) optimizer. (COBYLA is a derivative-free method, so it would not meet that requirement in any case.)
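As an aside, if you did want to stay with COBYLA, a common workaround (my sketch, not part of the original code) is to split the equality x1 + x2 = 1 into two opposing inequalities, since scipy treats an 'ineq' constraint as fun(x) >= 0. Bounds are omitted here because older scipy versions do not support bounds with COBYLA:

from scipy.optimize import minimize

def objective(x):
    return 2*(x[0]**2) + x[1]**2

# x1 + x2 = 1 rewritten as x1 + x2 - 1 >= 0 and 1 - (x1 + x2) >= 0
cons = [{'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1},
        {'type': 'ineq', 'fun': lambda x: 1 - x[0] - x[1]}]

sol_cobyla = minimize(objective, [0.3, 0.7], method='COBYLA', constraints=cons)
print(sol_cobyla.x)  # should land near [1/3, 2/3]

This can be less robust numerically than a solver with native equality support, so SLSQP is still the cleaner choice here.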

Below is the code with 'COBYLA' replaced by 'SLSQP', and with the objective function changed per point 1:

from scipy.optimize import minimize

def objective(x):
    x1 = x[0]
    x2 = x[1]
    return 2*(x1**2) + x2**2

def constraint1(x):
    return x[0] + x[1] - 1

# Try an initial condition of x1 = 0.3 and x2 = 0.7
# Our initial condition satisfies the constraint already
x0 = [0.3, 0.7]
print(objective(x0))
xnew = [0.25, 0.75]
print(objective(xnew))

# Since we have already calculated on paper, we know that x1 and x2
# fall between 0 and 1, so we can set both variables' bounds to (0, 1)
b = (0, 1)
bounds = (b, b)
# Note the type of constraint we have for our optimizer
con1 = {'type': 'eq', 'fun': constraint1}
cons = [con1]
sol_gradient = minimize(objective, x0, method='SLSQP', bounds=bounds, constraints=cons)
print(sol_gradient)

The final answer is:

     fun: 0.6666666666666665
     jac: array([1.33333336, 1.33333335])
 message: 'Optimization terminated successfully'
    nfev: 7
     nit: 2
    njev: 2
  status: 0
 success: True
       x: array([0.33333333, 0.66666667])
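As a sanity check, the Lagrange conditions give the same answer analytically: setting the gradient of 2x1^2 + x2^2 proportional to the gradient of the constraint gives 4x1 = lambda and 2x2 = lambda, so x2 = 2x1. Combined with x1 + x2 = 1, that yields x1 = 1/3 and x2 = 2/3, with objective value 2/9 + 4/9 = 2/3 ≈ 0.6667, matching fun and x above (and jac = [4/3, 4/3]).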