Exhaustive optimization of a regression function in Python

Posted 2024-04-23 23:43:27


I have to use least squares and leave-one-out cross-validation to estimate the mean squared error of a regression: Regression Function

For p from 1 to 30, find the value of p that gives the best regression, and use that best value of p to build the regression function.

The problem is that I don't really know how to go about it. I understand all of the math and could do it by hand, and I know Python, but I have a bit of a mental block. Is there anything in scikit-learn that can help? I know it has Lasso and Ridge feature selection, but that is like doing feature selection by hand; I need something that computes the function's weights for every value of p and calculates their least squared error. The thing is that my data has only a single feature x, and I am applying several values of p. Thanks!
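
A minimal sketch of the kind of search being asked for, assuming the regression function is a truncated Fourier series of order p (an assumption; the exact formula is only referenced above, not reproduced here) and using made-up data in place of the real x and y. It sweeps p from 1 to 30, fits each model by ordinary least squares, and scores it with leave-one-out cross-validation through scikit-learn:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# made-up data standing in for the real single-feature x and target y
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 80)
y = np.sin(2.0 * np.pi * 3.0 * x) + 0.1 * rng.standard_normal(80)

def fourier_features(x, p):
    # design matrix with columns sin(2*pi*j*x), cos(2*pi*j*x) for j = 1..p
    # (the intercept is handled by LinearRegression itself)
    cols = []
    for j in range(1, p + 1):
        cols.append(np.sin(2.0 * np.pi * j * x))
        cols.append(np.cos(2.0 * np.pi * j * x))
    return np.column_stack(cols)

loo_mse = {}
for p in range(1, 31):
    X = fourier_features(x, p)
    # leave-one-out CV: each of the n points is held out once as the test set
    scores = cross_val_score(LinearRegression(), X, y,
                             cv=LeaveOneOut(), scoring='neg_mean_squared_error')
    loo_mse[p] = -scores.mean()

best_p = min(loo_mse, key=loo_mse.get)
print('best p:', best_p, 'leave-one-out MSE:', loo_mse[best_p])

The p with the smallest leave-one-out MSE is then used to refit the regression on the full data set.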


Tags: method, function, definition, scikit-learn, cross-validation
1 answer
Forum user
#1 · Posted 2024-04-23 23:43:27

I found this question interesting enough to write this example code, which uses your formula on some test data of mine - you will need to replace the test data with your own. This should at least get you started. The example uses the scipy.optimize.differential_evolution module to automatically generate initial parameter estimates for the non-linear solver, and plots the results with matplotlib.

import numpy, scipy, matplotlib
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
from scipy.optimize import differential_evolution
import warnings

xData = numpy.array([19.1647, 18.0189, 16.9550, 15.7683, 14.7044, 13.6269, 12.6040, 11.4309, 10.2987, 9.23465, 8.18440, 7.89789, 7.62498, 7.36571, 7.01106, 6.71094, 6.46548, 6.27436, 6.16543, 6.05569, 5.91904, 5.78247, 5.53661, 4.85425, 4.29468, 3.74888, 3.16206, 2.58882, 1.93371, 1.52426, 1.14211, 0.719035, 0.377708, 0.0226971, -0.223181, -0.537231, -0.878491, -1.27484, -1.45266, -1.57583, -1.61717])
yData = numpy.array([0.644557, 0.641059, 0.637555, 0.634059, 0.634135, 0.631825, 0.631899, 0.627209, 0.622516, 0.617818, 0.616103, 0.613736, 0.610175, 0.606613, 0.605445, 0.603676, 0.604887, 0.600127, 0.604909, 0.588207, 0.581056, 0.576292, 0.566761, 0.555472, 0.545367, 0.538842, 0.529336, 0.518635, 0.506747, 0.499018, 0.491885, 0.484754, 0.475230, 0.464514, 0.454387, 0.444861, 0.437128, 0.415076, 0.401363, 0.390034, 0.378698])


def func(x, B0, B1, B2, B3, B4, B2p1, B2p, p):
    returnVal = B0 # start with B0 and add the other terms

    returnVal += B1 * numpy.sin(2.0 * numpy.pi * x)
    returnVal += B2 * numpy.cos(2.0 * numpy.pi * x)

    returnVal += B3 * numpy.sin(2.0 * numpy.pi * 2.0 * x)
    returnVal += B4 * numpy.cos(2.0 * numpy.pi * 2.0 * x)

    returnVal += B2p1 * numpy.sin(2.0 * numpy.pi * p * x)
    returnVal += B2p * numpy.cos(2.0 * numpy.pi * p * x)

    return  returnVal


# function for genetic algorithm to minimize (sum of squared error)
def sumOfSquaredError(parameterTuple):
    warnings.filterwarnings("ignore") # do not print warnings by genetic algorithm
    val = func(xData, *parameterTuple)
    return numpy.sum((yData - val) ** 2.0)


def generate_Initial_Parameters():
    parameterBounds = []
    parameterBounds.append([-1.0, 1.0]) # search bounds for B0
    parameterBounds.append([-1.0, 1.0]) # search bounds for B1
    parameterBounds.append([-1.0, 1.0]) # search bounds for B2
    parameterBounds.append([-1.0, 1.0]) # search bounds for B3
    parameterBounds.append([-1.0, 1.0]) # search bounds for B4
    parameterBounds.append([-1.0, 1.0]) # search bounds for B2p1
    parameterBounds.append([-1.0, 1.0]) # search bounds for B2p
    parameterBounds.append([-1.0, 1.0]) # search bounds for p

    # "seed" the numpy random number generator for repeatable results
    result = differential_evolution(sumOfSquaredError, parameterBounds, seed=3)
    return result.x

# generate initial parameter values
geneticParameters = generate_Initial_Parameters()

# curve fit the test data
fittedParameters, pcov = curve_fit(func, xData, yData, geneticParameters)

print('Parameters', fittedParameters)
print()

modelPredictions = func(xData, *fittedParameters) 

absError = modelPredictions - yData

SE = numpy.square(absError) # squared errors
MSE = numpy.mean(SE) # mean squared error
RMSE = numpy.sqrt(MSE) # Root Mean Squared Error, RMSE
Rsquared = 1.0 - (numpy.var(absError) / numpy.var(yData))
print('RMSE:', RMSE)
print('R-squared:', Rsquared)

print()


##########################################################
# graphics output section
def ModelAndScatterPlot(graphWidth, graphHeight):
    f = plt.figure(figsize=(graphWidth/100.0, graphHeight/100.0), dpi=100)
    axes = f.add_subplot(111)

    # first the raw data as a scatter plot
    axes.plot(xData, yData,  'D')

    # create data for the fitted equation plot
    xModel = numpy.linspace(min(xData), max(xData))
    yModel = func(xModel, *fittedParameters)

    # now the model as a line plot
    axes.plot(xModel, yModel)

    axes.set_xlabel('X Data') # X axis data label
    axes.set_ylabel('Y Data') # Y axis data label

    plt.show()
    plt.close('all') # clean up after using pyplot

graphWidth = 800
graphHeight = 600
ModelAndScatterPlot(graphWidth, graphHeight)
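
The code above treats p as a free continuous parameter (searched within [-1.0, 1.0]), whereas the question asks for the best integer p between 1 and 30 chosen by leave-one-out cross-validation. A minimal sketch of that selection step, reusing func(), xData and yData defined above; the loo_mse_for_p helper and the zero initial guesses are assumptions, not part of the answer itself:

# leave-one-out cross-validation over integer p (assumed helper, not from the answer above)
def loo_mse_for_p(pFixed):
    squaredErrors = []
    for i in range(len(xData)):
        trainMask = numpy.arange(len(xData)) != i  # leave sample i out

        # hold p fixed and fit only the B coefficients on the remaining points
        def wrappedFunc(x, B0, B1, B2, B3, B4, B2p1, B2p):
            return func(x, B0, B1, B2, B3, B4, B2p1, B2p, pFixed)

        # note: for pFixed = 1 or 2 the extra terms duplicate existing ones,
        # so the coefficients are not unique, but the fit still completes
        coeffs, _ = curve_fit(wrappedFunc, xData[trainMask], yData[trainMask], p0=[0.0] * 7)
        prediction = wrappedFunc(xData[i], *coeffs)
        squaredErrors.append((yData[i] - prediction) ** 2.0)
    return numpy.mean(squaredErrors)

looMSE = {p: loo_mse_for_p(p) for p in range(1, 31)}
bestP = min(looMSE, key=looMSE.get)
print('best p by leave-one-out CV:', bestP, 'with LOO MSE', looMSE[bestP])

Once the best p is found, the model can be refit on the full data set with that p held fixed.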
