Cannot import name NonlinearConstraint
I am trying to run the optimization example with nonlinear constraints shown here:
https://docs.scipy.org/doc/scipy/reference/tutorial/optimize.html
>>> import numpy as np
>>> def cons_f(x):
...     return [x[0]**2 + x[1], x[0]**2 - x[1]]
>>> def cons_J(x):
...     return [[2*x[0], 1], [2*x[0], -1]]
>>> def cons_H(x, v):
...     return v[0]*np.array([[2, 0], [0, 0]]) + v[1]*np.array([[2, 0], [0, 0]])
>>> from scipy.optimize import NonlinearConstraint
>>> nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1, jac=cons_J, hess=cons_H)
But when I try to import NonlinearConstraint, this is what I get:
ImportError: cannot import name NonlinearConstraint
I am running scipy v1.0.0:
>>> import scipy
>>> print scipy.__version__
1.0.0
Any suggestions? Thanks in advance for your help.
You will need scipy >= 1.1, or an installation based on the master branch!
Since 1.1 was only released recently (05.05.18), there is a chance that binary builds are already available (depending on how you obtain scipy).
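If scipy comes from pip, the upgrade is likely just a one-liner (exact steps depend on your environment; conda users would update through conda instead):

pip install --upgrade scipy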
In 1.1, scipy/optimize/__init__.py contains:

...
from ._lsq import least_squares, lsq_linear
from ._constraints import (NonlinearConstraint,
                           LinearConstraint,
                           Bounds)
from ._hessian_update_strategy import HessianUpdateStrategy, BFGS, SR1
__all__ = [s for s in dir() if not s.startswith('_')]
...
while in 1.0.x the same section only has:

...
from ._lsq import least_squares, lsq_linear
__all__ = [s for s in dir() if not s.startswith('_')]
...
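If your script has to run on machines where the installed scipy version is unknown, a small guard gives a clearer failure than the bare ImportError. A minimal sketch, assuming the version string starts with "major.minor":

import scipy

# Refuse to continue on scipy < 1.1, where NonlinearConstraint does not exist.
major, minor = (int(p) for p in scipy.__version__.split('.')[:2])
if (major, minor) < (1, 1):
    raise RuntimeError("NonlinearConstraint requires scipy >= 1.1, "
                       "found %s" % scipy.__version__)

from scipy.optimize import NonlinearConstraint  # safe to import from here on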
The 1.1 release notes give some more pointers:
scipy.optimize improvements
The method trust-constr has been added to scipy.optimize.minimize. The
method switches between two implementations depending on the problem
definition. For equality constrained problems it is an implementation of
a trust-region sequential quadratic programming solver and, when
inequality constraints are imposed, it switches to a trust-region
interior point method. Both methods are appropriate for large scale
problems. Quasi-Newton options BFGS and SR1 were implemented and can be
used to approximate second order derivatives for this new method. Also,
finite-differences can be used to approximate either first-order or
second-order derivatives.
This trust-constr solver is in fact what introduced those abstractions.
Additionally, optimize/_constraints.py does not exist in 1.0.1.
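For completeness, here is a minimal end-to-end sketch (assuming scipy >= 1.1) that plugs the constraint from the question into the new trust-constr solver, following the tutorial's Rosenbrock example:

import numpy as np
from scipy.optimize import (minimize, rosen, rosen_der, rosen_hess,
                            NonlinearConstraint)

def cons_f(x):
    return [x[0]**2 + x[1], x[0]**2 - x[1]]

def cons_J(x):
    return [[2*x[0], 1], [2*x[0], -1]]

def cons_H(x, v):
    # Hessians of the two constraint functions, weighted by the multipliers v
    return v[0]*np.array([[2, 0], [0, 0]]) + v[1]*np.array([[2, 0], [0, 0]])

nonlinear_constraint = NonlinearConstraint(cons_f, -np.inf, 1,
                                           jac=cons_J, hess=cons_H)

x0 = np.array([0.5, 0])
res = minimize(rosen, x0, method='trust-constr',
               jac=rosen_der, hess=rosen_hess,
               constraints=[nonlinear_constraint])
print(res.x)

If deriving cons_H by hand is inconvenient, the quasi-Newton options mentioned in the release notes can stand in for it, e.g. passing hess=BFGS() (imported from scipy.optimize) to NonlinearConstraint or to minimize, again on 1.1+ only.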