Is there a statistics summary table for sklearn.linear_model.Ridge?

Posted 2024-05-15 00:28:11


In statsmodels OLS, results.summary() shows a summary of the regression results (AIC, BIC, R-squared, and so on).

Is such a summary table available for sklearn.linear_model.Ridge?

I would really appreciate it if someone could point me in the right direction. Thank you.


Tags: model, aic, sklearn, summary, results, linear
1 answer

Answer #1 · Posted 2024-05-15 00:28:11

As far as I know, there is no R-style (or statsmodels-style) summary table in sklearn. (See this answer.)
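To see this concretely, here is a minimal sketch (assuming scikit-learn is installed; the data is made up for illustration): a fitted Ridge model exposes point estimates and a score, but no summary() method with AIC/BIC/p-values.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic data, just for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=50)

model = Ridge(alpha=1.0).fit(X, y)
print(model.coef_)                 # point estimates only
print(model.score(X, y))           # R^2 on the training data
print(hasattr(model, "summary"))   # False: no statsmodels-style table
```

So with sklearn alone you get coefficients and R^2, but the inferential statistics have to come from elsewhere.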

Instead, if you need one, there is the statsmodels.regression.linear_model.OLS.fit_regularized method (L1_wt=0 gives ridge regression).
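For reference, ridge regression also has a closed-form solution, so its shrinkage effect can be checked with numpy alone. A sketch (variable names are my own; note that the exact scaling of alpha differs between implementations):

```python
import numpy as np

# Synthetic data for illustration
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(scale=0.5, size=60)

def ridge_beta(X, y, alpha):
    """Closed-form ridge solution: (X'X + alpha*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

beta_ols = ridge_beta(X, y, 0.0)     # alpha=0 reduces to plain OLS
beta_ridge = ridge_beta(X, y, 10.0)  # a positive alpha shrinks toward 0
print(np.linalg.norm(beta_ols), np.linalg.norm(beta_ridge))
```

The ridge coefficient vector has a smaller norm than the OLS one, which is the behavior the alpha sweep below illustrates on real data.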

At the moment it seems that, despite the docstring below, model.fit_regularized(...).summary() returns None. However, the returned object does have params, which can be used to build a summary() in a roundabout way.

Returns: A RegressionResults object, of the same type returned by fit.

Example.

The sample data is not ideal for ridge regression, but I will try it anyway.

In:

import numpy as np
import pandas as pd
import statsmodels
import statsmodels.api as sm
import matplotlib.pyplot as plt

statsmodels.__version__

Out:

'0.8.0rc1'

In:

data = sm.datasets.ccard.load()

print("endog: " + data.endog_name)
print("exog: " + ', '.join(data.exog_name))

data.exog[:5, :]

Out:

endog: AVGEXP
exog: AGE, INCOME, INCOMESQ, OWNRENT
array([[ 38.    ,   4.52  ,  20.4304,   1.    ],
       [ 33.    ,   2.42  ,   5.8564,   0.    ],
       [ 34.    ,   4.5   ,  20.25  ,   1.    ],
       [ 31.    ,   2.54  ,   6.4516,   0.    ],
       [ 32.    ,   9.79  ,  95.8441,   1.    ]])

In:

y, X = data.endog, data.exog

model = sm.OLS(y, X)
results_fu = model.fit()

print(results_fu.summary())

Out:

                            OLS Regression Results                            
==============================================================================
Dep. Variable:                      y   R-squared:                       0.543
Model:                            OLS   Adj. R-squared:                  0.516
Method:                 Least Squares   F-statistic:                     20.22
Date:                Wed, 19 Oct 2016   Prob (F-statistic):           5.24e-11
Time:                        17:22:48   Log-Likelihood:                -507.24
No. Observations:                  72   AIC:                             1022.
Df Residuals:                      68   BIC:                             1032.
Df Model:                           4                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [0.025      0.975]
------------------------------------------------------------------------------
x1            -6.8112      4.551     -1.497      0.139     -15.892       2.270
x2           175.8245     63.743      2.758      0.007      48.628     303.021
x3            -9.7235      6.030     -1.613      0.111     -21.756       2.309
x4            54.7496     80.044      0.684      0.496    -104.977     214.476
==============================================================================
Omnibus:                       76.325   Durbin-Watson:                   1.692
Prob(Omnibus):                  0.000   Jarque-Bera (JB):              649.447
Skew:                           3.194   Prob(JB):                    9.42e-142
Kurtosis:                      16.255   Cond. No.                         87.5
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.

In:

frames = []
for n in np.arange(0, 0.25, 0.05).tolist():
    results_fr = model.fit_regularized(L1_wt=0, alpha=n, start_params=results_fu.params)

    # Wrap the regularized params in an OLSResults object to recover ssr
    results_fr_fit = sm.regression.linear_model.OLSResults(model,
                                                           results_fr.params,
                                                           model.normalized_cov_params)
    frames.append(np.append(results_fr.params, results_fr_fit.ssr))

# Build the comparison table once, after the loop
df = pd.DataFrame(frames, columns=data.exog_name + ['ssr*'])
df.index = np.arange(0, 0.25, 0.05).tolist()
df.index.name = 'alpha*'
df.T

Out:

[Image: table of coefficients and ssr* for each alpha*]

In:

%matplotlib inline

fig, ax = plt.subplots(1, 2, figsize=(14, 4))

df.iloc[:, :-1].plot(ax=ax[0])
ax[0].set_title('Coefficient')

df.iloc[:, -1].plot(ax=ax[1])
ax[1].set_title('SSR')

Out:

[Image: coefficient paths and SSR as functions of alpha]

In:

results_fr = model.fit_regularized(L1_wt=0, alpha=0.04, start_params=results_fu.params)
final = sm.regression.linear_model.OLSResults(model,
                                              results_fr.params,
                                              model.normalized_cov_params)

print(final.summary())

Out:

                            OLS Regression Results                            
==============================================================================
Dep. Variable:                      y   R-squared:                       0.543
Model:                            OLS   Adj. R-squared:                  0.516
Method:                 Least Squares   F-statistic:                     20.17
Date:                Wed, 19 Oct 2016   Prob (F-statistic):           5.46e-11
Time:                        17:22:49   Log-Likelihood:                -507.28
No. Observations:                  72   AIC:                             1023.
Df Residuals:                      68   BIC:                             1032.
Df Model:                           4                                         
Covariance Type:            nonrobust                                         
==============================================================================
                 coef    std err          t      P>|t|      [0.025      0.975]
------------------------------------------------------------------------------
x1            -5.6375      4.554     -1.238      0.220     -14.724       3.449
x2           159.1412     63.781      2.495      0.015      31.867     286.415
x3            -8.1360      6.034     -1.348      0.182     -20.176       3.904
x4            44.2597     80.093      0.553      0.582    -115.564     204.083
==============================================================================
Omnibus:                       76.819   Durbin-Watson:                   1.694
Prob(Omnibus):                  0.000   Jarque-Bera (JB):              658.948
Skew:                           3.220   Prob(JB):                    8.15e-144
Kurtosis:                      16.348   Cond. No.                         87.5
==============================================================================

Warnings:
[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.
