Python, Numpy and OLS
The code below runs fine, but it isn't what I want. I want to change c[1] to c[1:] so that the regression runs against all of the x variables instead of just one. When I make that change (and add the corresponding x labels), I get an error: ValueError: matrices are not aligned. Can anyone explain why this happens, and suggest how to fix the code? Thanks.
from numpy import *
from ols import *
a = [[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
[.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
[.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
[.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
[-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]]
c = column_stack(a)
y = c[0]
m = ols(y, c[1], y_varnm='y', x_varnm=['x1'])
print m.summary()
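To show the shapes involved, here is a minimal sketch with toy numbers (ols itself is not needed for this; the point is just what slicing rows of a column_stack result produces):

```python
import numpy as np

# Toy data in the same layout as the question: a list of 1-D series,
# stacked so that each original list becomes a COLUMN of c.
a = [[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]]
c = np.column_stack(a)

print(c[1].shape)    # (3,)   -> a single 1-D row: one variable, fine for ols
print(c[1:].shape)   # (2, 3) -> the remaining ROWS: variables run along rows
print(c[1:].T.shape) # (3, 2) -> transposed: observations as rows, as a design
                     #          matrix is usually expected
```

So c[1:] hands the regression a (variables x observations) array rather than the (observations x variables) array it presumably expects, which is consistent with a "matrices are not aligned" failure.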
Update: I found a partial solution, but there is still a problem. The code below works for 8 of the 9 explanatory variables.
c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, 9)])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8'])
print m.summary()
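Incidentally, the column_stack over a list comprehension can be written as a single transposed slice. A quick check with dummy data (nothing here depends on ols):

```python
import numpy as np

c = np.arange(100.).reshape(10, 10)  # dummy stand-in for the real c
x_loop = np.column_stack([c[i] for i in range(1, 9)])
x_slice = c[1:9].T  # rows 1..8 as columns: observations down, variables across
print(np.array_equal(x_loop, x_slice))  # True
```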
However, when I try to include the ninth x variable, I get an error: RuntimeWarning: divide by zero encountered in double_scalars. Does anyone know why this happens? Here is the code (note that len(a) = 10):
c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, len(a))])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8','x9'])
print m.summary()
1 Answer
I don't know anything about the ols module you are using. However, if you try the following, which uses scikits.statsmodels, it should work:
import numpy as np
import scikits.statsmodels.api as sm
a = np.array([[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
[.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
[.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
[.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
[-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]])
y = a[:, 0]
x = a[:, 1:]
results = sm.OLS(y, x).fit()
print results.summary()
Output:
Summary of Regression Results
=======================================
| Dependent Variable: ['y']|
| Model: OLS|
| Method: Least Squares|
| # obs: 10.0|
| Df residuals: 1.0|
| Df model: 8.0|
==============================================================================
| coefficient std. error t-statistic prob. |
------------------------------------------------------------------------------
| x0 0.2557 0.6622 0.3862 0.7654 |
| x1 0.03054 1.453 0.0210 0.9866 |
| x2 -3.392 2.444 -1.3877 0.3975 |
| x3 1.445 1.474 0.9808 0.5062 |
| x4 0.03559 0.2610 0.1363 0.9137 |
| x5 -0.7412 0.8754 -0.8467 0.5527 |
| x6 0.02289 0.2466 0.0928 0.9411 |
| x7 0.5754 1.413 0.4074 0.7537 |
| x8 -0.4827 0.7569 -0.6378 0.6386 |
==============================================================================
| Models stats Residual stats |
------------------------------------------------------------------------------
| R-squared: 0.8832 Durbin-Watson: 2.578 |
| Adjusted R-squared: -0.05163 Omnibus: 0.5325 |
| F-statistic: 0.9448 Prob(Omnibus): 0.7663 |
| Prob (F-statistic): 0.6663 JB: 0.1630 |
| Log likelihood: 41.45 Prob(JB): 0.9217 |
| AIC criterion: -64.91 Skew: 0.4037 |
| BIC criterion: -62.18 Kurtosis: 2.405 |
------------------------------------------------------------------------------
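As for the earlier RuntimeWarning: divide by zero, my guess (again, without having read the ols module) is a degrees-of-freedom problem rather than a data problem. If ols adds an intercept automatically, then 10 observations minus (9 regressors + 1 constant) leaves zero residual degrees of freedom, and the residual variance estimate divides by that. The sm.OLS fit above survives because no constant was added, leaving one residual degree of freedom, which matches the "Df residuals: 1.0" line in the summary:

```python
n_obs = 10        # len(a)
n_regressors = 9  # x1..x9

# With an automatic intercept (what ols presumably adds):
df_resid_with_const = n_obs - (n_regressors + 1)
print(df_resid_with_const)  # 0 -> s^2 = SSR / df_resid divides by zero

# Without a constant (the sm.OLS call above, no add_constant):
df_resid_no_const = n_obs - n_regressors
print(df_resid_no_const)    # 1 -> matches "Df residuals: 1.0" above
```

In short: with only 10 observations you can't estimate 9 slopes plus an intercept and still have anything left over to estimate the error variance from.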