Equivalent of cov_x from (legacy) scipy.optimize.leastsq in scipy.optimize.least_squares

The legacy scipy.optimize.leastsq function returns a cov_x parameter:

cov_x: ndarray

Uses the fjac and ipvt optional outputs to construct an estimate of the jacobian around the solution. None if a singular matrix encountered (indicates very flat curvature in some direction). This matrix must be multiplied by the residual variance to get the covariance of the parameter estimates – see curve_fit.

This is helpful for estimating the variance of the parameter estimates.
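
For context, a minimal sketch of how cov_x is typically turned into a parameter covariance (the linear model, data, and starting values here are made up for illustration):

import numpy as np
from scipy.optimize import leastsq

def residuals(p, x, y):
    a, b = p
    return y - (a * x + b)

x = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# full_output=True makes leastsq return cov_x and an info dict.
popt, cov_x, infodict, mesg, ier = leastsq(residuals, [1.0, 0.0], args=(x, y), full_output=True)

# Scale cov_x by the residual variance to get the covariance of the
# parameter estimates, as the documentation says (this mirrors curve_fit).
dof = x.size - len(popt)
s_sq = np.sum(infodict["fvec"] ** 2) / dof
pcov = cov_x * s_sq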

What is the equivalent of this parameter in the new scipy.optimize.least_squares? It has:

jac : ndarray, sparse matrix or LinearOperator, shape (m, n)

Modified Jacobian matrix at the solution, in the sense that J^T J is a Gauss-Newton approximation of the Hessian of the cost function. The type is the same as the one used by the algorithm.

but that is not really equivalent.

I don't think there is an obvious equivalent. jac is not the same thing: it is an estimate of the Jacobian matrix, i.e. the matrix of partial derivatives used to compute the gradient at the optimized minimum.
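
That said, the covariance that curve_fit reports for the 'trf' and 'dogbox' methods is computed from this Jacobian via a Moore-Penrose pseudoinverse, so you can assemble a cov_x-style estimate from a least_squares result yourself. A rough sketch, assuming a dense res.jac (the model and data are made up):

import numpy as np
from scipy.optimize import least_squares

def residuals(p, x, y):
    a, b = p
    return y - (a * x + b)

x = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

res = least_squares(residuals, [1.0, 0.0], args=(x, y))

# cov_x-like quantity: pseudo-inverse of J^T J built from the returned Jacobian.
J = res.jac
cov_x_equiv = np.linalg.pinv(J.T @ J)

# As with leastsq's cov_x, scale by the residual variance to get the
# parameter covariance (res.cost is 0.5 * sum(residuals**2)).
dof = x.size - len(res.x)
pcov = cov_x_equiv * (2 * res.cost / dof)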

You can use curve_fit to perform the least-squares regression, which will return a covariance matrix; a short example follows the documentation excerpt below.

Returns:

popt : array Optimal values for the parameters so that the sum of the squared residuals of f(xdata, *popt) - ydata is minimized

pcov : 2d array The estimated covariance of popt. The diagonals provide the variance of the parameter estimate. To compute one standard deviation errors on the parameters use perr = np.sqrt(np.diag(pcov)). How the sigma parameter affects the estimated covariance depends on absolute_sigma argument, as described above. If the Jacobian matrix at the solution doesn’t have a full rank, then ‘lm’ method returns a matrix filled with np.inf, on the other hand ‘trf’ and ‘dogbox’ methods use Moore-Penrose pseudoinverse to compute the covariance matrix.
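
A minimal usage sketch (again with a made-up model and data):

import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * x + b

x = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

popt, pcov = curve_fit(model, x, y, p0=[1.0, 0.0])

# One-standard-deviation errors on the parameters, as in the docs.
perr = np.sqrt(np.diag(pcov))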

See also scipy.stats.linregress, which also performs a least-squares fit and returns the correlation coefficient, which is related to the covariance.
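
For a simple straight-line fit that is often all you need; a short sketch:

import numpy as np
from scipy.stats import linregress

x = np.linspace(0, 10, 50)
rng = np.random.default_rng(0)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

result = linregress(x, y)
# Slope and intercept, the slope's standard error, and the correlation coefficient.
print(result.slope, result.intercept, result.stderr, result.rvalue)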