Fit an arbitrary number of parameters when calling curve_fit

The closest match to this question that I have found is: Fitting only one parameter of a function with many parameters in python. I have a function with many parameters, and I would like to be able to call it with different subsets of those parameters being optimized in different parts of the code (this is useful because, for some datasets, I may be able to fix some parameters based on ancillary data). A simplified demonstration of the problem follows.

from scipy.optimize import curve_fit
import numpy as np

def wrapper_func(**kwargs):
    a = kwargs['a'] if 'a' in kwargs else None
    b = kwargs['b'] if 'b' in kwargs else None
    c = kwargs['c'] if 'c' in kwargs else None
    return lambda x, a, c: func(x, a, b, c)

def func(x, a, b, c):
    return a * x**2 + b * x + c

# Set parameters    
a = 0.3
b = 5
c = 17 

# Make some fake data
x_vals = np.arange(100)
y_vals = a * x_vals**2 + b * x_vals + c
noise = np.random.randn(100) * 20

# Get fit
popt, pcov = curve_fit(lambda x, a_, c_: func(x, a_, b, c_), 
                       x_vals, y_vals + noise)

# Get fit using separate function
alt_popt, alt_cov = curve_fit(wrapper_func(b=5), x_vals, y_vals + noise)

So this works, but I would like to be able to pass arbitrary combinations of parameters to be fixed. Here parameters a and c are optimized while b is fixed, but what if I want to fix a and optimize b and c (or any other combination)? Is there a clean way to do this? I started with wrapper_func() above, but ran into the same problem: there seems to be no way to vary which parameters get optimized, short of writing multiple lambdas (conditional on which fixed parameter values are passed). That gets ugly quickly, because the equations I am working with have 4-6 parameters. I can get a version working with eval, but that is not recommended. As it stands, I have been fumbling around trying to use *args with a lambda, but have not managed to get it to work. Any tips are much appreciated!
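To make the goal concrete, here is a sketch of the kind of generic wrapper I have in mind (illustrative only; the helper name make_wrapper and the use of inspect.signature are just one possible approach, not verified code). Note that curve_fit cannot inspect a *args signature, so p0 has to be supplied explicitly.

import inspect
import numpy as np
from scipy.optimize import curve_fit

def func(x, a, b, c):
    return a * x**2 + b * x + c

def make_wrapper(f, **fixed):
    # return (wrapped, free_names): wrapped(x, *free_vals) calls f with the
    # parameters named in `fixed` held constant and the rest taken from free_vals
    names = list(inspect.signature(f).parameters)[1:]  # skip the independent variable
    free_names = [n for n in names if n not in fixed]

    def wrapped(x, *free_vals):
        kwargs = dict(zip(free_names, free_vals))
        kwargs.update(fixed)
        return f(x, **kwargs)

    return wrapped, free_names

# e.g. fix b, optimize a and c; p0 is required because wrapped takes *args
x_vals = np.arange(100)
y_vals = func(x_vals, 0.3, 5, 17) + np.random.randn(100) * 20
wrapped, free_names = make_wrapper(func, b=5)
popt, pcov = curve_fit(wrapped, x_vals, y_vals, p0=np.ones(len(free_names)))
print(dict(zip(free_names, popt)))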

Here is my solution. I am not sure how to do it with curve_fit, but it works with leastsq. It uses a wrapper function that takes the free and fixed parameters as well as a list of the free parameters' positions. Since leastsq calls the function with the free parameters first, the wrapper has to rearrange the order.

from matplotlib import pyplot as plt
import numpy as np
from scipy.optimize import leastsq

def func(x,a,b,c,d,e):
    return a+b*x+c*x**2+d*x**3+e*x**4

# takes x, the 5 parameters, and a list
# the first n parameters are free
# the list of length n gives their positions, e.g. 2 parameters of 1st and 3rd order -> [1,3]
# the remaining parameters follow in order, i.e. in this example the call order is f(x,b,d,a,c,e)
def expand_parameters(*args):
    callArgs=args[1:6]
    freeList=args[-1]
    fixedList=list(range(5))
    for item in freeList:
        fixedList.remove(item)
    callList=[0,0,0,0,0]
    for val,pos in zip(callArgs, freeList+fixedList):
        callList[pos]=val
    return func(args[0],*callList)

def residuals(parameters,dataPoint,fixedParameterValues=None,freeParametersPosition=None):
    if fixedParameterValues is None:
        a,b,c,d,e = parameters
        dist = [y -func(x,a,b,c,d,e) for x,y in dataPoint] 
    else:
        assert len(fixedParameterValues)==5-len(freeParametersPosition)
        assert len(fixedParameterValues)>0
        assert len(fixedParameterValues)<5 # doesn't make sense to fix all
        extraIn=list(parameters)+list(fixedParameterValues)+[freeParametersPosition]
        dist = [y -expand_parameters(x,*extraIn) for x,y in dataPoint]
    return dist


if __name__=="__main__":
    xList=np.linspace(-1,3,15)
    fList=np.fromiter( (func(s,1.1,-.9,-.7,.5,.1) for s in xList), float)

    fig=plt.figure()
    ax=fig.add_subplot(1,1,1)

    dataTupel=list(zip(xList,fList))  # list, so it can be iterated more than once

    ###some test
    print(residuals([1.1,-.9,-.7,.5,.1],dataTupel))
    print(residuals([1.1,-.9,-.7,.5],dataTupel,fixedParameterValues=[.1],freeParametersPosition=[0,1,2,3]))

    #exact fit
    bestFitValuesAll, ier = leastsq(residuals, [1,1,1,1,1],args=(dataTupel,))
    print(bestFitValuesAll)

    ###Only a constant
    guess=[1]
    bestFitValuesConstOnly, ier = leastsq(residuals, guess,args=(dataTupel,[0,0,0,0],[0]))
    print(bestFitValuesConstOnly)
    fConstList=np.fromiter(( func(x,*np.append(bestFitValuesConstOnly,[0,0,0,0])) for x in xList),float)

    ###Only 2nd and 4th
    guess=[1,1]
    bestFitValues_1_3, ier = leastsq(residuals, guess,args=(dataTupel,[0,0,0],[2,4]))
    print(bestFitValues_1_3)
    f_1_3_List=np.fromiter(( expand_parameters(x, *(list(bestFitValues_1_3)+[0,0,0]+[[2,4]] ) )  for x in xList),float)


    ###Only 2nd and 4th with closer values
    guess=[1,1]
    bestFitValues_1_3_closer, ier = leastsq(residuals, guess,args=(dataTupel,[1.2,-.8,0],[2,4]))
    print(bestFitValues_1_3_closer)
    f_1_3_closer_List=np.fromiter(( expand_parameters(x, *(list(bestFitValues_1_3_closer)+[1.2,-.8,0]+[[2,4]] ) )  for x in xList),float)


    ax.plot(xList,fList,linestyle='',marker='o',label='orig')
    ax.plot(xList,fConstList,linestyle='',marker='o',label='0')
    ax.plot(xList,f_1_3_List,linestyle='',marker='o',label='1,3')
    ax.plot(xList,f_1_3_closer_List,linestyle='',marker='o',label='1,3 c')

    ax.legend(loc=0)

    plt.show()

This gives:

>>[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
>>[0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
>>[ 1.1 -0.9 -0.7  0.5  0.1]
>>[ 2.64880466]
>>[-0.14065838  0.18305123]
>>[-0.31708629  0.2227272 ]
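
To make the argument reordering in expand_parameters concrete, here is a quick sanity check (my addition, reusing the functions defined above): with free positions [1, 3], the wrapper is called as (x, b, d, a, c, e, [1, 3]) and should reproduce func(x, a, b, c, d, e).

x = 0.5
print(expand_parameters(x, -.9, .5, 1.1, -.7, .1, [1, 3]))  # b and d free, then a, c, e in order
print(func(x, 1.1, -.9, -.7, .5, .1))                       # same value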

lmfit (https://lmfit.github.io/lmfit-py/) does exactly this. Instead of creating an array of floating point values for the parameters in the fit, you create a Parameters object -- an ordered dictionary of Parameter objects used to parametrize the model for the data. Each parameter can be fixed or varied in the fit, can have max/min bounds, or can be defined by a simple mathematical expression in terms of the other parameters in the fit.
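As a small illustration of that last point (my sketch, not from the answer; the parameter names and values are made up), a constraint expression ties one parameter to another so that it is neither free nor simply fixed:

from lmfit import Parameters

params = Parameters()
params.add('a', value=0.3)            # varied in the fit
params.add('b', value=5, vary=False)  # held fixed
params.add('c', expr='2*a + 1')       # constrained to follow a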

That is, using lmfit (and its Model class, which is especially useful for curve fitting), you create the parameters and can then decide which will be optimized and which will be held fixed.

For example, here is a variation of the problem you posed:

import numpy as np
from lmfit import Model
import matplotlib.pylab as plt

# starting parameters
a, b, c = 0.3, 5, 17
x_vals = np.arange(100)
noise = np.random.normal(size=100, scale=0.25)
y_vals = a * x_vals**2 + b * x_vals + c + noise

def func(x, a, b, c):
    return a * x**2 + b * x + c

# create a Model from this function
model = Model(func)

# create parameters with initial values. Model will know to 
# turn function args `a`, `b`, and `c` into Parameters:
params = model.make_params(a=0.25, b=4, c=10)

# you can alter each parameter, for example, fix b or put bounds on a
params['b'].vary = False
params['b'].value = 5.3
params['a'].min = -1
params['a'].max =  1

# run fit
result = model.fit(y_vals, params, x=x_vals)

# print and plot results
print(result.fit_report())
result.plot(datafmt='--')
plt.show()

This will print out:

[[Model]]
    Model(func)
[[Fit Statistics]]
    # function evals   = 12
    # data points      = 100
    # variables        = 2
    chi-square         = 475.843
    reduced chi-square = 4.856
    Akaike info crit   = 159.992
    Bayesian info crit = 165.202
[[Variables]]
    a:   0.29716481 +/- 7.46e-05 (0.03%) (init= 0.25)
    b:   5.3 (fixed)
    c:   11.4708897 +/- 0.329508 (2.87%) (init= 10)
[[Correlations]] (unreported correlations are <  0.100)
    C(a, c)                      = -0.744 

(you will see that a and c are highly and negatively correlated) and show a plot of the data and the fit.

In addition, the result of the fit, including its parameters, is held in result, so if you want to change which parameters are fixed, you can simply change the starting values (which are not updated by the fit):

params['b'].vary = True
params['a'].value = 0.285
params['a'].vary = False

newresult = model.fit(y_vals, params, x=x_vals)

and then compare/contrast the two fit results.
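
For example (a minimal sketch, assuming the result and newresult objects from above):

print(result.fit_report())
print(newresult.fit_report())

# or compare individual best-fit values side by side
for name in ('a', 'b', 'c'):
    print(name, result.params[name].value, newresult.params[name].value)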