Is it possible to vectorize scipy.optimize.fminbound?

I have some trajectory data points parameterized by time, and I want to know the shortest distance from each point to the curve fitted through them. There seem to be several ways to approach this (e.g. here or here), but I chose scipy.optimize.fminbound, as proposed here (because it seemed to be the neatest approach, and because I actually got it to work):

import numpy as np
import pandas as pd
from scipy.optimize import fminbound


data = np.array(
    [
        (212.82275865, 1650.40828168, 0., 0),
        (214.22056952, 1649.99898924, 10.38, 0),
        (212.86786868, 1644.25228805, 116.288, 0),
        (212.78680031, 1643.87461108, 122.884, 0),
        (212.57489485, 1642.01124032, 156.313, 0),
        (212.53483954, 1641.61858242, 162.618, 0),
        (212.43922274, 1639.58782771, 196.314, 0),
        (212.53726315, 1639.13842423, 202.619, 0),
        (212.2888428, 1637.24641296, 236.306, 0),
        (212.2722447, 1636.92307229, 242.606, 0),
        (212.15559302, 1635.0529813, 276.309, 0),
        (212.17535631, 1634.60618711, 282.651, 0),
        (212.02545613, 1632.72139574, 316.335, 0),
        (211.99988779, 1632.32053329, 322.634, 0),
        (211.33419846, 1631.07592039, 356.334, 0),
        (211.58972239, 1630.21971902, 362.633, 0),
        (211.70332122, 1628.2088542, 396.316, 0),
        (211.74610735, 1627.67591368, 402.617, 0),
        (211.88819518, 1625.67310022, 436.367, 0),
        (211.90709414, 1625.39410321, 442.673, 0),
        (212.00090348, 1623.42655008, 476.332, 0),
        (211.9249017, 1622.94540583, 482.63, 0),
        (212.34321938, 1616.32949197, 597.329, 0),
        (213.09638942, 1615.2869643, 610.4, 0),
        (219.41313491, 1580.22234313, 1197.332, 0),
        (220.38660128, 1579.20043302, 1210.37, 0),
        (236.35472669, 1542.30863041, 1798.267, 0),
        (237.41755384, 1541.41679119, 1810.383, 0),
        (264.08373622, 1502.66620597, 2398.244, 0),
        (265.65655239, 1501.64308908, 2410.443, 0),
        (304.66999824, 1460.94068336, 2997.263, 0),
        (306.22550945, 1459.75817211, 3010.38, 0),
        (358.88879764, 1416.472238, 3598.213, 0),
        (361.14046402, 1415.40942931, 3610.525, 0),
        (429.96379858, 1369.7972467, 4198.282, 0),
        (432.06565776, 1368.22265539, 4210.505, 0),
        (519.30493383, 1319.01141844, 4798.277, 0),
        (522.12134083, 1317.68234967, 4810.4, 0),
        (630.00294242, 1265.05368942, 5398.236, 0),
        (633.67624272, 1263.63633508, 5410.431, 0),
        (766.29767476, 1206.91262814, 5997.266, 0),
        (770.78300649, 1205.48393374, 6010.489, 0),
        (932.92308019, 1145.87780431, 6598.279, 0),
        (937.54373403, 1141.55438694, 6609.525, 0),
    ], dtype=[
        ('x', 'f8'), ('y', 'f8'), ('t', 'f8'), ('dmin', 'f8'),
    ]
)

# fyi my data comes as a structured array; unfortunately, simply passing
# data[['x', 'y']] to np.polyfit does not work. using
# pd.DataFrame(data[['x', 'y']]).values seems to be the easiest solution:
# 
coeffs = np.polyfit(
    data['t'], pd.DataFrame(data[['x', 'y']]).values, 3
)
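
# a pandas-free alternative (a sketch, assuming numpy >= 1.16, where
# numpy.lib.recfunctions.structured_to_unstructured is available):
#
#     from numpy.lib.recfunctions import structured_to_unstructured
#     coeffs = np.polyfit(data['t'], structured_to_unstructured(data[['x', 'y']]), 3)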


def curve(t):
    # this can probably also be done in one statement (see the note after
    # this function), but I don't know how
    x = np.polyval(coeffs[:, 0], t)
    y = np.polyval(coeffs[:, 1], t)

    return x, y
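
# note: curve() can probably be written in one statement; a sketch, assuming
# np.polynomial.polynomial.polyval, which accepts a 2-D coefficient array
# with coefficients in ascending order (hence the [::-1]):
#
#     x, y = np.polynomial.polynomial.polyval(t, coeffs[::-1])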


def f(t, p):
    x, y = curve(t)
    return np.hypot(x - p['x'], y - p['y'])


# instead of this:
for point in data:
    tmin = fminbound(f, -50, 6659.525, args=(point, ))
    point['dmin'] = f(tmin, point)


# do something like this:
# tmin = fminbound(f, -50, 6659.525, args=(data, ))
# data['dmin'] = f(tmin, data)

But as you can see, I use a for loop to compute the shortest distance for each data point, which slows my program down considerably, since it is executed thousands of times. I would therefore like to vectorize the operation / improve its performance, but I haven't found a way to do so. There are related posts (e.g. here or here), but I don't see how the proposed solutions apply to my case.

No, it is not possible to vectorize fminbound, since it expects a scalar function of one variable. However, you can still vectorize the loop by reformulating the underlying mathematical optimization problem:

The N scalar optimization problems

min f_1(t) s.t. t_l <= t <= t_u
min f_2(t) s.t. t_l <= t <= t_u
.
.
.
min f_N(t) s.t. t_l <= t <= t_u

for scalar functions f_i are equivalent to a single optimization problem in N variables:

min f_1(t_1)**2 + ... + f_N(t_N)**2  s.t. t_l <= t_i <= t_u for all i = 1, .., N

which can be solved by means of scipy.optimize.minimize. The equivalence holds because the variables t_i are decoupled, so the sum is minimized exactly when each term is, and squaring does not change the minimizers since each f_i is a non-negative distance. Depending on your overall algorithm, this approach can eliminate further loops as well: instead of thousands of scalar optimization problems, you solve just one large-scale optimization problem.

After cleaning up your code, this can be done as follows:

import numpy as np
from scipy.optimize import minimize

data = np.array([
    (212.82275865, 1650.40828168, 0., 0),
    (214.22056952, 1649.99898924, 10.38, 0),
    (212.86786868, 1644.25228805, 116.288, 0),
    (212.78680031, 1643.87461108, 122.884, 0),
    (212.57489485, 1642.01124032, 156.313, 0),
    (212.53483954, 1641.61858242, 162.618, 0),
    (212.43922274, 1639.58782771, 196.314, 0),
    (212.53726315, 1639.13842423, 202.619, 0),
    (212.2888428, 1637.24641296, 236.306, 0),
    (212.2722447, 1636.92307229, 242.606, 0),
    (212.15559302, 1635.0529813, 276.309, 0),
    (212.17535631, 1634.60618711, 282.651, 0),
    (212.02545613, 1632.72139574, 316.335, 0),
    (211.99988779, 1632.32053329, 322.634, 0),
    (211.33419846, 1631.07592039, 356.334, 0),
    (211.58972239, 1630.21971902, 362.633, 0),
    (211.70332122, 1628.2088542, 396.316, 0),
    (211.74610735, 1627.67591368, 402.617, 0),
    (211.88819518, 1625.67310022, 436.367, 0),
    (211.90709414, 1625.39410321, 442.673, 0),
    (212.00090348, 1623.42655008, 476.332, 0),
    (211.9249017, 1622.94540583, 482.63, 0),
    (212.34321938, 1616.32949197, 597.329, 0),
    (213.09638942, 1615.2869643, 610.4, 0),
    (219.41313491, 1580.22234313, 1197.332, 0),
    (220.38660128, 1579.20043302, 1210.37, 0),
    (236.35472669, 1542.30863041, 1798.267, 0),
    (237.41755384, 1541.41679119, 1810.383, 0),
    (264.08373622, 1502.66620597, 2398.244, 0),
    (265.65655239, 1501.64308908, 2410.443, 0),
    (304.66999824, 1460.94068336, 2997.263, 0),
    (306.22550945, 1459.75817211, 3010.38, 0),
    (358.88879764, 1416.472238, 3598.213, 0),
    (361.14046402, 1415.40942931, 3610.525, 0),
    (429.96379858, 1369.7972467, 4198.282, 0),
    (432.06565776, 1368.22265539, 4210.505, 0),
    (519.30493383, 1319.01141844, 4798.277, 0),
    (522.12134083, 1317.68234967, 4810.4, 0),
    (630.00294242, 1265.05368942, 5398.236, 0),
    (633.67624272, 1263.63633508, 5410.431, 0),
    (766.29767476, 1206.91262814, 5997.266, 0),
    (770.78300649, 1205.48393374, 6010.489, 0),
    (932.92308019, 1145.87780431, 6598.279, 0),
    (937.54373403, 1141.55438694, 6609.525, 0)])

# fit a cubic polynomial to x(t) and y(t) in one call (np.polyfit accepts
# a 2-D right-hand side and returns one coefficient column per component)
coeffs = np.polyfit(data[:, 2], data[:, 0:2], 3)

# the points
points = data[:, :2]

# vectorized version of your objective function
# i.e. evaluates f_1, ..., f_N
def f(t, points):
    poly_x = np.polyval(coeffs[:, 0], t)
    poly_y = np.polyval(coeffs[:, 1], t)
    return np.hypot(poly_x - points[:, 0], poly_y - points[:, 1])

# the scalar objective function we want to minimize,
# i.e. f_1(t_1)**2 + ... + f_N(t_N)**2
def obj_vec(t, points):
    vals = f(t, points)
    return np.sum(vals**2)

# variable bounds
bnds = [(-50, 6659.525)]*len(points)

# solve the optimization problem
res = minimize(lambda t: obj_vec(t, points), x0=np.zeros(len(points)), bounds=bnds)
dmins = f(res.x, points)
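
# optionally store the distances back into the fourth column of data
# (a small usage sketch, not part of the original answer)
data[:, 3] = dmins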

To accelerate the optimization further, it is highly recommended to pass the exact gradient of the objective function to minimize. Otherwise, the gradient is approximated by finite differences, which is quite slow:

In [7]: %timeit res = minimize(lambda t: obj_vec(t, points), x0=np.zeros(len(points)), bounds=bnds)
91.5 ms ± 868 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)

Since the gradient can easily be computed via the chain rule, I'd normally leave it as an exercise for the reader :) but here it is. By the chain rule, the gradient reads:

def grad(t, points):
    poly_x = np.polyval(coeffs[:, 0], t)
    poly_y = np.polyval(coeffs[:, 1], t)
    poly_x_deriv = np.polyval(np.polyder(coeffs[:, 0], m=1), t)
    poly_y_deriv = np.polyval(np.polyder(coeffs[:, 1], m=1), t)
    return 2*poly_x_deriv*(poly_x - points[:, 0]) + 2*(poly_y_deriv)*(poly_y - points[:, 1])
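
Before trusting an analytic gradient, it is worth comparing it against finite differences. A minimal sketch using scipy.optimize.check_grad (the test point np.ones(len(points)) is an arbitrary choice):

from scipy.optimize import check_grad

# 2-norm of the difference between the analytic gradient and a
# finite-difference approximation; it should be small relative to
# the gradient's magnitude
err = check_grad(lambda t: obj_vec(t, points),
                 lambda t: grad(t, points),
                 np.ones(len(points)))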

and passing it to minimize reduces the runtime significantly:

In [9]: %timeit res = minimize(lambda t: obj_vec(t, points), jac=lambda t: grad(t, points), x0=np.zeros(len(points)),
     ...: bounds=bnds)
6.13 ms ± 63.2 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

Finally, choosing a different starting point can also lead to fewer iterations, as you already noted in the comments.
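
A minimal sketch of such a warm start (this specific choice of x0 is my assumption, not part of the answer above): each point's own time stamp should already lie close to the parameter of its nearest curve point, so the measured times make a natural initial guess:

# warm start at the measured time stamps instead of at zero
res = minimize(lambda t: obj_vec(t, points),
               jac=lambda t: grad(t, points),
               x0=data[:, 2], bounds=bnds)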