How to pass options to objective in cyipopt
I am trying to pass options to an NLP that I am solving with cyipopt.
These options affect the objective in the same way at every iteration. For example, the tutorial problem is to minimize
x_1 * x_4 * (x_1 + x_2 + x_3) + x_3
subject to some constraints (see https://pythonhosted.org/ipopt/tutorial.html).
I would like to solve the related problem
scale * x_1 * x_4 * (x_1 + x_2 + x_3) + x_3
where scale is a parameter set before the optimization. The code below shows how the problem can be set up in cyipopt, but scale is hard-coded to 2. How can I make it an option so that it can be changed flexibly?
import ipopt
import numpy as np

class hs071(object):
    def __init__(self):
        pass

    def objective(self, x):
        # The callback for calculating the objective
        scale = 2
        return scale * x[0] * x[3] * np.sum(x[0:3]) + x[2]

    def gradient(self, x):
        # The callback for calculating the gradient
        scale = 2
        return np.array([
            scale * x[0] * x[3] + scale * x[3] * np.sum(x[0:3]),
            scale * x[0] * x[3],
            scale * x[0] * x[3] + 1.0,
            scale * x[0] * np.sum(x[0:3])
        ])

    def constraints(self, x):
        # The callback for calculating the constraints
        return np.array((np.prod(x), np.dot(x, x)))

    def jacobian(self, x):
        # The callback for calculating the Jacobian
        return np.concatenate((np.prod(x) / x, 2*x))

x0 = [1.0, 5.0, 5.0, 1.0]
lb = [1.0, 1.0, 1.0, 1.0]
ub = [5.0, 5.0, 5.0, 5.0]
cl = [25.0, 40.0]
cu = [2.0e19, 40.0]

nlp = ipopt.problem(
    n=len(x0),
    m=len(cl),
    problem_obj=hs071(),
    lb=lb,
    ub=ub,
    cl=cl,
    cu=cu
)

x, info = nlp.solve(x0)
Note: defining a global variable works, but it is sloppy. There must be a cleaner way to do this, since this is how data gets attached to an optimization problem.
Add them to the class itself:
import ipopt
import numpy as np

class hs071(object):
    def __init__(self):
        pass

    def objective(self, x):
        # The callback for calculating the objective
        scale = self.scale
        return scale * x[0] * x[3] * np.sum(x[0:3]) + x[2]

    def gradient(self, x):
        # The callback for calculating the gradient
        scale = self.scale
        return np.array([
            scale * x[0] * x[3] + scale * x[3] * np.sum(x[0:3]),
            scale * x[0] * x[3],
            scale * x[0] * x[3] + 1.0,
            scale * x[0] * np.sum(x[0:3])
        ])

    def constraints(self, x):
        # The callback for calculating the constraints
        return np.array((np.prod(x), np.dot(x, x)))

    def jacobian(self, x):
        # The callback for calculating the Jacobian
        return np.concatenate((np.prod(x) / x, 2*x))

x0 = [1.0, 5.0, 5.0, 1.0]
lb = [1.0, 1.0, 1.0, 1.0]
ub = [5.0, 5.0, 5.0, 5.0]
cl = [25.0, 40.0]
cu = [2.0e19, 40.0]

model = hs071()
model.scale = 2

nlp = ipopt.problem(
    n=len(x0),
    m=len(cl),
    problem_obj=model,
    lb=lb,
    ub=ub,
    cl=cl,
    cu=cu
)

x, info = nlp.solve(x0)
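A slightly tidier variant of the same idea (a sketch, not part of the original answer) is to accept `scale` in `__init__`, so the attribute is guaranteed to exist before the solver invokes any callback, rather than being bolted on afterwards:

```python
import numpy as np

class hs071:
    def __init__(self, scale=1.0):
        # Store the problem parameter once, before optimization starts;
        # every callback can then read it as self.scale.
        self.scale = scale

    def objective(self, x):
        # scale * x1 * x4 * (x1 + x2 + x3) + x3
        return self.scale * x[0] * x[3] * np.sum(x[0:3]) + x[2]

    def gradient(self, x):
        s = self.scale
        return np.array([
            s * x[0] * x[3] + s * x[3] * np.sum(x[0:3]),
            s * x[0] * x[3],
            s * x[0] * x[3] + 1.0,
            s * x[0] * np.sum(x[0:3]),
        ])

# Parameter set once, used identically at every iteration:
model = hs071(scale=2.0)
```

The `constraints`, `jacobian`, and `ipopt.problem(...)` setup are unchanged; `problem_obj=model` is passed exactly as in the answer above.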