How to stop the iteration when the Jacobian reaches an arbitrary (small) value in the Newton-CG method?

How do I set a stopping condition on the Jacobian (or gradient) for the Newton-CG method? I would like the algorithm to stop as soon as the Jacobian reaches 1e-2. Is that possible with Newton-CG?

Input:

scipy.optimize.minimize(f, [5.0,1.0,2.0,5.0], args=Data, method='Newton-CG',jac=Jacf)

Output:

     jac: array([7.64265411e-08, 1.74985718e-08, 4.12408407e-07, 5.02972841e-08])
 message: 'Optimization terminated successfully.'
    nfev: 12
    nhev: 0
     nit: 11
    njev: 68
  status: 0
 success: True
       x: array([0.22545395, 0.3480084 , 1.06811724, 1.64873479])

In the BFGS method, which is similar to Newton-CG, there is a gtol option that stops the iteration once the gradient norm reaches a certain value. Newton-CG has no such option.
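For reference, this is how that gtol option is passed to BFGS. A minimal, self-contained sketch on a toy quadratic (not the problem below):

```python
import numpy as np
from scipy.optimize import minimize

# Toy quadratic objective with gradient 2*x, just to illustrate gtol.
def f(x):
    return np.sum(x**2)

def grad(x):
    return 2.0 * x

# With method='BFGS', the iteration stops once the (infinity) norm of
# the gradient drops below gtol.
res = minimize(f, x0=[5.0, 1.0], method='BFGS', jac=grad,
               options={'gtol': 1e-2})
print(res.jac)  # every component should be below 1e-2 in magnitude
```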

Does anyone know how to stop the iteration once the Jacobian reaches 1e-2?

Here are the details needed to reproduce my code:

import numpy as np
import pandas as pd

def convert_line2matrix(a):
    n = len(a)
    if (np.sqrt(n) % 1 == 0) :
        d = int(np.sqrt(n))
        Mat = np.zeros((d,d))
        for i in range(d):
            for j in range(d):
                Mat[i,j] = a[j+d*i] 
    else:
        raise ValueError(f"{a} can't be converted into a (n x n) matrix. The array has {len(a)} elements, \n\t    thus impossible to build a square matrix with {len(a)} elements.")
    return Mat

def convert_matrix2line(Matrix):
    result = []
    dim = len(Matrix) 
    for i in range(dim):
        for j in range(dim):
            result.append(Matrix[i,j])
    return np.array(result)
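As a side note, these two helpers reproduce NumPy's row-major reshape and ravel, so they could be replaced by one-liners. A minimal equivalence sketch:

```python
import numpy as np

a = np.arange(4.0)        # a flat array with a square number of elements
d = int(np.sqrt(len(a)))  # 2

Mat = a.reshape(d, d)     # same row-major layout as convert_line2matrix
line = Mat.ravel()        # same element ordering as convert_matrix2line

print(np.array_equal(line, a))
```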


my_data = np.array([[0.21530249, 0.32450331, 0        ],
       [0.1930605 , 0.31788079, 0        ],
       [0.17793594, 0.31788079, 0        ],
       [0.16459075, 0.31125828, 1        ],
       [0.24822064, 0.31125828, 0        ],
       [0.28647687, 0.32450331, 0        ],
       [0.32829181, 0.31788079, 0        ],
       [0.38879004, 0.32450331, 0        ],
       [0.42882562, 0.32450331, 0        ],
       [0.47419929, 0.32450331, 0        ],
       [0.5044484 , 0.32450331, 0        ],
       [0.1797153 , 0.31125828, 0        ],
       [0.16548043, 0.31125828, 1        ],
       [0.17793594, 0.29801325, 1        ],
       [0.1930605 , 0.31788079, 0        ]])

Data = pd.DataFrame(my_data, columns=['X_1','X_2', 'Allum'])



def logLB(params,Data):
    B = convert_line2matrix(params)
    X = np.array(Data.iloc[:,:len(B)]) 
    Y = np.array(Data.iloc[:,len(B)])

    result = 0
    n = len(Data)
    BB = np.transpose(B) @ B
    for i in range(n):
        if(1-np.exp(-X[i].T @ BB @ X[i]) > 0):
            result += Y[i]*(-np.transpose(X[i]) @ BB @ X[i]) + (1 - Y[i])*np.log(1-np.exp(-X[i].T @ BB @ X[i]))

    return result

def f(params, Data):
    return -logLB(params, Data)



def dlogLB(params, Data):
    B = convert_line2matrix(params)
    X = np.array(Data.iloc[:,:len(B)]) 
    Y = np.array(Data.iloc[:,len(B)])
 
    BB = B.T @ B
    N = len(Data)
    M = len(B)
    Jacobian = np.zeros(np.shape(B))
    for n in range(len(B)):
        for m in range(len(B)):
            result = 0
            for c in range(N):
                som = 0
                for i in range(M):
                    som += X[c,m]*B[n,i]*X[c,i]
                if (1 - np.exp(-X[c].T @ BB @ X[c]) > 0):
                    result += -2*Y[c]*som + (1-Y[c])*np.exp(-X[c].T @ BB @ X[c])*(2*som)/(1 - np.exp(-X[c].T @ BB @ X[c]))

                Jacobian[n,m] = result 

    return convert_matrix2line(Jacobian)

def Jacf(params, Data):
    return -dlogLB(params, Data)
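A hand-coded gradient like Jacf can be sanity-checked against a finite-difference approximation with scipy.optimize.check_grad. A self-contained sketch on a toy function (hypothetical stand-ins for f and Jacf, just to show the check):

```python
import numpy as np
from scipy.optimize import check_grad

# Toy objective and its analytic gradient.
def f_toy(x):
    return np.sum(np.exp(-x**2))

def grad_toy(x):
    # d/dx_i of sum_j exp(-x_j^2) is -2*x_i*exp(-x_i^2)
    return -2.0 * x * np.exp(-x**2)

# check_grad returns the 2-norm of the difference between the analytic
# gradient and a finite-difference approximation; it should be tiny.
err = check_grad(f_toy, grad_toy, np.array([0.5, -1.0, 2.0]))
print(err)
```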

I assume you want to stop the optimizer as soon as the Euclidean norm of the gradient reaches a specific value, which is exactly what the gtol option of the BFGS method means. Anything else wouldn't make mathematical sense, since the evaluated gradient is a vector and therefore can't be compared with a scalar value.

The Newton-CG method doesn't provide a similar option. However, you could use a simple callback that is called after each iteration, and terminate the algorithm as soon as the callback returns True. Unfortunately, you can only terminate the optimizer through the callback's return value with the trust-constr method. For all other methods the callback's return value is ignored, so this is quite limited.
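To illustrate the trust-constr case on a toy quadratic (this is a sketch, not your problem): with that method the callback receives the current iterate and an OptimizeResult state, and returning True stops the solver.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.sum(x**2)

def grad(x):
    return 2.0 * x

# With method='trust-constr' the callback signature is (xk, state) and
# the optimizer terminates as soon as the callback returns True.
def cb(xk, state):
    return np.linalg.norm(grad(xk)) <= 1e-2

res = minimize(f, [5.0, 1.0], method='trust-constr', jac=grad, callback=cb)
print(res.x)
```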

Anyway, one possible (admittedly hacky and ugly) way to terminate the optimizer through a callback is to raise an exception:

import numpy as np
from scipy.optimize import minimize

class Callback:
    def __init__(self, eps, args, jac):
        self.eps = eps
        self.args = args
        self.jac = jac
        self.x = None
        self.gtol = None

    def __call__(self, xk):
        # store the current iterate and its gradient norm, then abort
        # the optimizer by raising once the norm is below the tolerance
        self.x = xk
        self.gtol = np.linalg.norm(self.jac(xk, *self.args))
        if self.gtol <= self.eps:
            raise Exception("Gradient norm is below threshold")

Here, xk is the current iterate, eps your desired tolerance, args a tuple containing your optional objective and gradient arguments, and jac the gradient. You can then use it like this:

from scipy.optimize import minimize

cb = Callback(1.0e-1, (Data,), Jacf)

try:
    res = minimize(f, [5.0,1.0,2.0,5.0], args=Data, method='Newton-CG',
                   jac=Jacf, callback=cb)
except Exception:
    x = cb.x
    gtol = cb.gtol
    print(f"gtol = {gtol:E}, x = {x}")

which produces

gtol = 5.515263E-02, x = [14.43322108 -5.18163542  0.22582261 -0.04859385]
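A slightly cleaner variation of the same trick uses a dedicated exception class, so the except clause doesn't accidentally swallow unrelated errors. A self-contained sketch on a toy quadratic (the objective, gradient, and their scale argument here are hypothetical, just to keep the example runnable):

```python
import numpy as np
from scipy.optimize import minimize

# A dedicated exception type lets us catch exactly this termination
# signal instead of a bare `except:` that would also hide real bugs.
class GradientNormReached(Exception):
    pass

class Callback:
    def __init__(self, eps, args, jac):
        self.eps = eps
        self.args = args
        self.jac = jac
        self.x = None
        self.gtol = None

    def __call__(self, xk):
        self.x = xk
        self.gtol = np.linalg.norm(self.jac(xk, *self.args))
        if self.gtol <= self.eps:
            raise GradientNormReached

# Toy objective and gradient (stand-ins for f and Jacf).
def f(x, scale):
    return scale * np.sum(x**2)

def jac(x, scale):
    return 2.0 * scale * x

cb = Callback(1.0e-1, (1.0,), jac)
try:
    res = minimize(f, [5.0, 1.0], args=(1.0,), method='Newton-CG',
                   jac=jac, callback=cb)
except GradientNormReached:
    print(f"gtol = {cb.gtol:E}, x = {cb.x}")
```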