Curve fitting implemented using a Maximum Likelihood Estimator is not working

I am implementing Maximum Likelihood Estimation on discrete count data for the purpose of curve fitting, using the result of curve_fit as the starting point for minimize. I defined and tried these methods for several distributions, but to keep things simple I will include only one here, the log-series distribution.

At this point, I have also tried the following statsmodels approaches (a sketch of the third one follows the list):

  1. statsmodels.discrete.discrete_model.fit
  2. statsmodels.discrete.count_model.fit
  3. statsmodels.base.model.GenericLikelihoodModel
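A minimal sketch of how GenericLikelihoodModel can be set up for the log-series case is below (the class name LogSeriesMLE and the use of scipy.stats.logser.logpmf are illustrative choices, not the exact code I ran):

import numpy as np
from scipy import stats
from statsmodels.base.model import GenericLikelihoodModel

class LogSeriesMLE(GenericLikelihoodModel):
    # log-likelihood of a log-series distribution for the observed values
    def loglike(self, params):
        p = params[0]
        return np.sum(stats.logser.logpmf(self.endog, p))

# endog would need to be the individual observations, not the binned counts:
# model = LogSeriesMLE(raw_observations)
# result = model.fit(start_params=[0.7])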

Most of the curve fits tend to run into overflow errors, or into nans and zeros internally. I will describe those errors in detail in another post.
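One way to pin down where those overflows and nans occur (a debugging aid, not part of the fitting code below) is to turn numpy's floating-point warnings into exceptions:

import numpy as np
# raise immediately on overflow, division by zero or invalid (nan-producing)
# operations instead of silently propagating inf/nan through the fit
np.seterr(over='raise', divide='raise', invalid='raise')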
# Import a few packages
import numpy as np
import matplotlib.pyplot as plt
from numpy import log
from scipy import stats
from scipy.optimize import curve_fit, minimize

#Given data
x=np.arange(1, 28, 1)
y=np.array([18899, 10427, 6280, 4281, 2736, 1835, 1158, 746, 467, 328, 201, 129, 65, 69, 39, 21, 15, 10, 3, 3, 1, 1, 1, 1, 1, 1, 1])

# Define a custom distribution: the log-series PMF
def Logser(x, p):
    return (-p**x) / (x * log(1 - p))
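# Sanity check (not in my original script): the custom Logser should match
# scipy's built-in log-series PMF, which uses the same formula
k_check = np.arange(1, 10)
assert np.allclose(Logser(k_check, 0.8), stats.logser.pmf(k_check, 0.8))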

# Do a least-squares curve fit to get an initial parameter guess
def lsqfit(x, y):
    cf_result = curve_fit(Logser, x, y, p0=0.7, bounds=(0.5, 1), method='trf')
    return cf_result

param_guess = lsqfit(x, y)[0][0]
print(param_guess)

# Custom MLE definition, minimized using the scipy minimize function
def MLERegression(param_guess):
    yhat = Logser(x, param_guess)  # predictions based on a parameter value
    sd = 1  # initial guess: fit a normal error distribution around the regressed curve
    # next, we flip the Bayesian question:
    # compute the PDF of the observed values, normally distributed around the mean (yhat)
    # with a standard deviation of sd
    negLL = -np.sum(stats.norm.logpdf(y, loc=yhat, scale=sd))  # negative log-likelihood
    return negLL

# for a single parameter, minimize expects bounds as a sequence of (low, high) pairs
results = minimize(MLERegression, param_guess, method='L-BFGS-B',
                   bounds=[(0.5, 1.0)], options={'disp': True})
final_param = results['x']
print(final_param)

I have had to constrain the optimizer to get results similar to what I expect (a parameter value of roughly 0.8 or 0.9); otherwise the algorithm outputs zero.

I think this is due to scaling. When I change the equation to "scale * (-p**X)/(X * log(1-p))" by adding a scale factor, I get the following values without using any bounds: p = 9.0360470735534726E-01 and scale = 5.1189277041342692E+04, which produce the following result:

My fitted value of p is indeed 0.9.
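For completeness, the scaled version can be fitted roughly like this (a sketch assuming the same x, y and log import as above; the function name Logser_scaled and the starting values are my own choices):

# Scaled log-series curve: scale * (-p**x) / (x * log(1 - p))
def Logser_scaled(x, p, scale):
    return scale * (-p**x) / (x * log(1 - p))

popt, pcov = curve_fit(Logser_scaled, x, y, p0=[0.7, 1e4])
print(popt)  # as reported above, this fit gives p ≈ 0.9 and scale ≈ 5.1e4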