Minimizing a non-convex function with linear constraints and bounds in mystic

Suppose I have a non-convex objective function loss that takes an np.ndarray named X of shape (n,) and returns a float. The objective has many, many local minima, because it is essentially a function of np.round(X * c, 2), where c is another constant array of shape (n,).

You can picture it like this:

import numpy as np
from scipy.optimize import rosen

def loss(X: np.ndarray) -> float:
    c = np.array([0.1, 0.5, -0.8, 7.0, 0.0])
    X_rounded = np.round(X * c, 2)
    return rosen(X_rounded)

The linear constraints are expressed with two constant matrices (also stored as numpy ndarrays): A of shape (m, n) and b of shape (m,). I need to minimize loss with respect to X while keeping A.dot(X) == b. In addition, every element of X must satisfy 0 <= X_i <= 2, and I have a decent initial guess X0 = [1, 1, ..., 1].
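To make the equality constraint concrete: one way (not the only one) to keep A.dot(X) == b is to project a candidate X onto the affine subspace of solutions using the pseudoinverse. The A and b below are made up purely for illustration; this is roughly the effect that a constraint-mapping function must achieve.

```python
import numpy as np

# Hypothetical constraint data, only for illustration (m=2, n=5).
A = np.array([[1., 1., 0., 0., 0.],
              [0., 0., 1., 1., 1.]])
b = np.array([1.5, 2.0])

def project_onto_constraint(x, A, b):
    """Project x onto the affine subspace {x : A @ x == b}.

    Uses x_proj = x - A^+ (A x - b), where A^+ is the pseudoinverse;
    exact whenever A has full row rank.
    """
    return x - np.linalg.pinv(A) @ (A @ x - b)

x0 = np.ones(5)
x_feasible = project_onto_constraint(x0, A, b)
print(np.allclose(A @ x_feasible, b))  # True
```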

I don't need the global minimum; the search can stop as soon as loss(X) <= 1. The objective is mostly written in SQL and is therefore ridiculously slow, so I'd also like the optimization to terminate after loss has been evaluated about 200 times. (That's not a hard requirement; terminating after the optimization has run for 5 minutes would also work.)
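Independently of the solver, an evaluation or wall-clock budget can be enforced by wrapping loss in a counter that raises once the budget is spent; the solver call is then wrapped in a try/except. The names below are my own, not part of scipy or mystic.

```python
import time

class EvaluationBudgetExceeded(Exception):
    """Raised when the wrapped objective has used up its budget."""

def with_budget(fn, max_evals=200, max_seconds=300):
    """Wrap fn so it raises once either budget is exhausted."""
    state = {"evals": 0, "start": time.monotonic()}

    def wrapped(x):
        if state["evals"] >= max_evals:
            raise EvaluationBudgetExceeded("evaluation budget spent")
        if time.monotonic() - state["start"] > max_seconds:
            raise EvaluationBudgetExceeded("time budget spent")
        state["evals"] += 1
        return fn(x)

    return wrapped
```

One would keep track of the best point seen so far (e.g. via a monitor or a small closure), since the exception aborts the solver before it returns a result.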

With scipy, I can do

rv = minimize(loss,
              initial_guess,
              method='SLSQP',
              bounds=[(0, 2)] * n,
              constraints={
                  'type': 'eq',
                  'fun': lambda x: A.dot(x) - b
              },
              options={
                  'maxiter': 5
              })

I'm not happy with this solution, because the result is worse than the artificial initial guess (which, as a smoke test, was sampled around the global minimum), probably due to the abundance of local minima and some numerical issues. Also, I cannot estimate the number of objective calls per iteration (otherwise I could cap the number of evaluations by setting maxiter).
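The numerical trouble is plausibly the rounding itself: np.round(X * c, 2) is piecewise constant, so within each plateau the finite-difference gradients that SLSQP relies on are exactly zero. A quick check with the c from the example (the step size here is merely representative of a typical finite-difference step):

```python
import numpy as np

c = np.array([0.1, 0.5, -0.8, 7.0, 0.0])
x = np.ones(5)
eps = 1e-8  # a typical finite-difference step size

# A tiny step does not change the rounded values at all...
same = np.array_equal(np.round(x * c, 2), np.round((x + eps) * c, 2))
print(same)  # True: the objective looks locally flat to a gradient-based solver
```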

How can I do better with mystic, which is presumably more flexible?

Since I don't know what A and b are, I'll improvise. So it won't be exactly the same as your problem, but it should be close enough.

Let's set up the problem by building the loss function and the constraints. There may be better ways to build the constraints, but the following is very general (albeit a bit ugly):

>>> import mystic as my
>>> import numpy as np
>>> from mystic.models import rosen
>>>
>>> A = np.array([[9., 0., 0., 8., -1],
...               [1., 1., -1., 0., 0.],
...               [2., -2., 6., 0., 5.]])
>>> b = np.array([18., .75, 11.5])
>>> c = np.array([0.1, 0.5, -0.8, 7.0, 0.0])
>>>
>>> def loss(x):
...     x_rounded = np.round(x * c, 2)
...     return rosen(x_rounded)
...
>>> cons = my.symbolic.linear_symbolic(A, b)
>>> cons = my.symbolic.solve(cons)
>>> cons = my.symbolic.generate_constraint(my.symbolic.generate_solvers(cons))
>>> bounds = [(0,2)] * len(c)

Then let's try to solve for the global minimum:

>>> stepmon = my.monitors.VerboseMonitor(1)
>>> rv = my.solvers.diffev2(loss, x0=bounds, bounds=bounds, constraints=cons, itermon=stepmon, disp=1, npop=20)
Generation 0 has ChiSquare: 15478.596962
Generation 1 has ChiSquare: 1833.714503
Generation 2 has ChiSquare: 1833.714503
Generation 3 has ChiSquare: 270.601079
Generation 4 has ChiSquare: 160.690618
Generation 5 has ChiSquare: 160.690618
Generation 6 has ChiSquare: 127.289639
Generation 7 has ChiSquare: 127.289639
Generation 8 has ChiSquare: 127.289639
Generation 9 has ChiSquare: 123.054668
Generation 10 has ChiSquare: 123.054668
Generation 11 has ChiSquare: 123.054668
Generation 12 has ChiSquare: 122.561794
Generation 13 has ChiSquare: 121.069338
Generation 14 has ChiSquare: 120.828279
Generation 15 has ChiSquare: 117.732442
Generation 16 has ChiSquare: 117.732442
Generation 17 has ChiSquare: 117.340042
Generation 18 has ChiSquare: 117.340042
Generation 19 has ChiSquare: 117.340042
Generation 20 has ChiSquare: 117.340042
Generation 21 has ChiSquare: 117.340042
Generation 22 has ChiSquare: 116.750933
Generation 23 has ChiSquare: 116.750933
Generation 24 has ChiSquare: 116.750933
Generation 25 has ChiSquare: 116.750933
Generation 26 has ChiSquare: 116.750933
Generation 27 has ChiSquare: 116.750933
Generation 28 has ChiSquare: 116.750933
Generation 29 has ChiSquare: 116.750933
Generation 30 has ChiSquare: 116.750933
Generation 31 has ChiSquare: 116.750933
Generation 32 has ChiSquare: 116.750933
Generation 33 has ChiSquare: 116.750933
Generation 34 has ChiSquare: 116.750933
Generation 35 has ChiSquare: 116.750933
Generation 36 has ChiSquare: 116.750933
Generation 37 has ChiSquare: 116.750933
Generation 38 has ChiSquare: 116.750933
Generation 39 has ChiSquare: 116.750933
Generation 40 has ChiSquare: 116.750933
Generation 41 has ChiSquare: 116.750933
Generation 42 has ChiSquare: 116.750933
Generation 43 has ChiSquare: 116.750933
Generation 44 has ChiSquare: 116.750933
Generation 45 has ChiSquare: 116.750933
Generation 46 has ChiSquare: 116.750933
Generation 47 has ChiSquare: 116.750933
Generation 48 has ChiSquare: 116.750933
Generation 49 has ChiSquare: 116.750933
Generation 50 has ChiSquare: 116.750933
Generation 51 has ChiSquare: 116.750933
STOP("VTRChangeOverGeneration with {'ftol': 0.005, 'gtol': 1e-06, 'generations': 30, 'target': 0.0}")
Optimization terminated successfully.
         Current function value: 116.750933
         Iterations: 51
         Function evaluations: 1040

>>> A.dot(rv)
array([18.  ,  0.75, 11.5 ])

That works (though it may still not be the global minimum)... but it takes some time. So, let's try a faster local solver.

>>> stepmon = my.monitors.VerboseMonitor(1)
>>> rv = my.solvers.fmin_powell(loss, x0=[1]*len(c), bounds=bounds, constraints=cons, itermon=stepmon, disp=1)
Generation 0 has ChiSquare: 244559.856997
Generation 1 has ChiSquare: 116357.59447400003
Generation 2 has ChiSquare: 121.23445799999999
Generation 3 has ChiSquare: 117.635447
Generation 4 has ChiSquare: 117.59764200000001
Generation 5 has ChiSquare: 117.59764200000001
Optimization terminated successfully.
         Current function value: 117.597642
         Iterations: 5
         Function evaluations: 388
STOP("NormalizedChangeOverGeneration with {'tolerance': 0.0001, 'generations': 2}")

>>> A.dot(rv)
array([18.  ,  0.75, 11.5 ])

Not bad. However, you wanted to limit the number of loss evaluations, and also to be able to stop when loss gets close to a minimum... so let's say stop when loss(x) <= 120. I'll also limit the number of function evaluations to 200.

>>> stepmon = my.monitors.VerboseMonitor(1)
>>> rv = my.solvers.fmin_powell(loss, x0=[1]*len(c), bounds=bounds, constraints=cons, itermon=stepmon, disp=1, maxfun=200, gtol=None, ftol=120)
Generation 0 has ChiSquare: 244559.856997
Generation 1 has ChiSquare: 116357.59447400003
Generation 2 has ChiSquare: 121.23445799999999
Generation 3 has ChiSquare: 117.635447
Optimization terminated successfully.
         Current function value: 117.635447
         Iterations: 3
         Function evaluations: 175
STOP("VTRChangeOverGeneration with {'ftol': 120, 'gtol': 1e-06, 'generations': 30, 'target': 0.0}")

>>> A.dot(rv)
array([18.  ,  0.75, 11.5 ])
>>> rv
array([1.93873933, 0.00381084, 1.19255017, 0.0807893 , 0.0949684 ])

It's even more flexible if you use the solver class interface, but I'll leave that for next time.