scipy.optimise TypeError - takes 1 positional argument but 2 were given with external functions
I'm trying to speed up the functions called by scipy's minimize. They were originally all lambdas, so I thought I'd replace them with numba @njit functions.
But I get this exception:
  File "/blah/opt.py", line 142, in normalise
    result = minimize(
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_minimize.py", line 631, in minimize
    return _minimize_slsqp(fun, x0, args, jac, bounds,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/slsqp.py", line 375, in _minimize_slsqp
    sf = _prepare_scalar_function(func, x, jac=jac, args=args, epsilon=eps,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/optimize.py", line 261, in _prepare_scalar_function
    sf = ScalarFunction(fun, x0, args, grad, hess,
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 159, in __init__
    self._update_grad()
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 238, in _update_grad
    self._update_grad_impl()
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 149, in update_grad
    self.g = grad_wrapped(self.x)
  File "/blah/venv/lib/python3.8/site-packages/scipy/optimize/_differentiable_functions.py", line 146, in grad_wrapped
    return np.atleast_1d(grad(np.copy(x), *args))
TypeError: <lambda>() takes 1 positional argument but 2 were given
Here is the code being used:
@njit(cache=True)
def fn(x, weights):
    return np.sum((x - weights) ** 2)

@njit(cache=True)
def fn_cons(x):
    return np.sum(np.abs(x)) - 1

cons = ({'type': 'eq',
         'fun': fn_cons
         })

class TestSpeedup:
    def normalise(self, weights):
        result = minimize(
            fn,
            np.array(weights),
            args=(weights,),
            jac=lambda x: 2 * (x - weights),
            bounds=[(0, np.infty) for _ in weights],
            constraints=cons
        )
        minimum = result.x
        # return np.max([new_weights, np.zeros(new_weights.size)], axis=0) / np.sum(np.max([new_weights, np.zeros(new_weights.size)], axis=0))
        return minimum / np.sum(np.abs(minimum))

weights = np.array([1.04632843e+00, -6.89001783e-02, 2.17089646e-01, -2.52113073e-01, 4.19467585e-03])

test = TestSpeedup()
result = test.normalise(weights)
The functions are defined outside the class, so the first argument isn't self. Not sure what I'm missing here. Any suggestions?
The jacobian function is called with the same arguments as the objective function, so you should rewrite the lambda accordingly, e.g.:
lambda x, w: 2 * (x - w)
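Not from the original post, but to make the fix concrete: a minimal self-contained sketch of the corrected call, using plain NumPy without numba (the @njit decorators don't affect the fix) and np.inf instead of the deprecated np.infty:

```python
import numpy as np
from scipy.optimize import minimize

def fn(x, weights):
    # objective: squared distance from the original weights
    return np.sum((x - weights) ** 2)

def fn_cons(x):
    # equality constraint: absolute values must sum to 1
    return np.sum(np.abs(x)) - 1

weights = np.array([1.04632843, -0.06890018, 0.21708965, -0.25211307, 0.00419468])
result = minimize(
    fn,
    np.array(weights),
    args=(weights,),
    # the jacobian now accepts the same extra args as the objective
    jac=lambda x, w: 2 * (x - w),
    bounds=[(0, np.inf) for _ in weights],
    constraints={'type': 'eq', 'fun': fn_cons},
)
```

Because args=(weights,) is forwarded to both the objective and the jacobian, the lambda must take two positional arguments even though it could close over weights.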
Alternatively, you can rewrite the objective function so that it also computes the Jacobian, and specify jac=True in the call to minimize():
@njit(cache=True)
def fn(x, weights):
    d = x - weights
    err = d @ d
    jac = 2 * d
    return err, jac
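A sketch of how that combined function would be used (again plain NumPy without numba, and np.inf instead of the deprecated np.infty):

```python
import numpy as np
from scipy.optimize import minimize

def fn(x, weights):
    # returns (objective value, gradient) in one pass
    d = x - weights
    return d @ d, 2 * d

def fn_cons(x):
    # equality constraint: absolute values must sum to 1
    return np.sum(np.abs(x)) - 1

weights = np.array([1.04632843, -0.06890018, 0.21708965, -0.25211307, 0.00419468])
result = minimize(
    fn,
    np.array(weights),
    args=(weights,),
    jac=True,  # tells minimize that fn returns (value, gradient)
    bounds=[(0, np.inf) for _ in weights],
    constraints={'type': 'eq', 'fun': fn_cons},
)
```

This avoids a separate jacobian callable entirely and reuses the intermediate x - weights for both the value and the gradient.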