Newey-West standard errors for OLS in Python?


Edit (10/31/2015) to reflect the preferred statsmodels coding style as of Fall 2015.

In statsmodels version 0.6.1 you can do the following:

import pandas as pd
import numpy as np
import statsmodels.formula.api as smf

df = pd.DataFrame({'a': [1, 3, 5, 7, 4, 5, 6, 4, 7, 8, 9],
                   'b': [3, 5, 6, 2, 4, 6, 7, 8, 7, 8, 9]})

reg = smf.ols('a ~ 1 + b', data=df).fit(cov_type='HAC', cov_kwds={'maxlags': 1})
print(reg.summary())

                            OLS Regression Results
==============================================================================
Dep. Variable:                      a   R-squared:                       0.281
Model:                            OLS   Adj. R-squared:                  0.201
Method:                 Least Squares   F-statistic:                     1.949
Date:                Sat, 31 Oct 2015   Prob (F-statistic):              0.196
Time:                        03:15:46   Log-Likelihood:                -22.603
No. Observations:                  11   AIC:                             49.21
Df Residuals:                       9   BIC:                             50.00
Df Model:                           1
Covariance Type:                  HAC
==============================================================================
                 coef    std err          z      P>|z|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
Intercept      2.0576      2.661      0.773      0.439        -3.157     7.272
b              0.5595      0.401      1.396      0.163        -0.226     1.345
==============================================================================
Omnibus:                        0.361   Durbin-Watson:                   1.468
Prob(Omnibus):                  0.835   Jarque-Bera (JB):                0.331
Skew:                           0.321   Prob(JB):                        0.847
Kurtosis:                       2.442   Cond. No.                         19.1
==============================================================================

Warnings:
[1] Standard Errors are heteroscedasticity and autocorrelation robust (HAC) using 1 lags and without small sample correction
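
If all you need are the Newey-West standard errors rather than the full summary table, they can be read straight off the fitted results object. A minimal sketch (my addition, not part of the original answer); bse, cov_params() and pvalues are standard statsmodels results attributes, and after fit(cov_type='HAC', ...) they reflect the robust covariance:

# After fit(cov_type='HAC', ...), the results attributes use the robust covariance,
# so these are the Newey-West quantities rather than the classical OLS ones.
print(reg.bse)           # HAC (Newey-West) standard error for each coefficient
print(reg.cov_params())  # full HAC covariance matrix of the parameter estimates
print(reg.pvalues)       # p-values based on the HAC standard errors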

Or you can use the get_robustcov_results method after fitting the model:

reg = smf.ols('a ~ 1 + b', data=df).fit()
new = reg.get_robustcov_results(cov_type='HAC', maxlags=1)
print(new.summary())

                            OLS Regression Results
==============================================================================
Dep. Variable:                      a   R-squared:                       0.281
Model:                            OLS   Adj. R-squared:                  0.201
Method:                 Least Squares   F-statistic:                     1.949
Date:                Sat, 31 Oct 2015   Prob (F-statistic):              0.196
Time:                        03:15:46   Log-Likelihood:                -22.603
No. Observations:                  11   AIC:                             49.21
Df Residuals:                       9   BIC:                             50.00
Df Model:                           1
Covariance Type:                  HAC
==============================================================================
                 coef    std err          z      P>|z|      [95.0% Conf. Int.]
------------------------------------------------------------------------------
Intercept      2.0576      2.661      0.773      0.439        -3.157     7.272
b              0.5595      0.401      1.396      0.163        -0.226     1.345
==============================================================================
Omnibus:                        0.361   Durbin-Watson:                   1.468
Prob(Omnibus):                  0.835   Jarque-Bera (JB):                0.331
Skew:                           0.321   Prob(JB):                        0.847
Kurtosis:                       2.442   Cond. No.                         19.1
==============================================================================

Warnings:
[1] Standard Errors are heteroscedasticity and autocorrelation robust (HAC) using 1 lags and without small sample correction
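
The warning at the bottom of the summary notes that no small-sample correction was applied. A hedged sketch of how to switch that correction on; the use_correction keyword is a statsmodels HAC option and this call is my addition, not part of the original answer:

# Same regression, but asking statsmodels to apply the HAC small-sample correction.
reg_corr = smf.ols('a ~ 1 + b', data=df).fit(
    cov_type='HAC', cov_kwds={'maxlags': 1, 'use_correction': True})
print(reg_corr.bse)  # expect slightly larger standard errors than in the summary above

The same keyword should also be accepted by get_robustcov_results(cov_type='HAC', maxlags=1, use_correction=True).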

The statsmodels defaults are slightly different from the defaults of the equivalent method in R (the NeweyWest function from the sandwich package, used here via coeftest). To make the R method equivalent to the statsmodels default shown above, change the vcov. call as follows:

temp.summ$coefficients <- unclass(coeftest(temp.lm,
    vcov. = NeweyWest(temp.lm, lag = 1, prewhite = FALSE)))
print(temp.summ$coefficients)

             Estimate Std. Error   t value  Pr(>|t|)
(Intercept) 2.0576208  2.6605060 0.7733945 0.4591196
b           0.5594796  0.4007965 1.3959193 0.1962142
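
To compare numerically with the R output above, it can help to pull out the HAC covariance matrix itself on the Python side. A sketch using statsmodels.stats.sandwich_covariance; the nlags and use_correction arguments are my assumptions about that helper's signature and are not from the original answer:

import numpy as np
import statsmodels.stats.sandwich_covariance as sw

plain = smf.ols('a ~ 1 + b', data=df).fit()   # ordinary, non-robust fit
# Recompute the Newey-West covariance from the OLS residuals; use_correction=False
# is intended to match the "without small sample correction" default shown above.
hac_cov = sw.cov_hac(plain, nlags=1, use_correction=False)
print(np.sqrt(np.diag(hac_cov)))              # should line up with the HAC std err column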

You can still do Newey-West in pandas (0.17), although I believe the plan is to deprecate OLS in pandas:

print(pd.stats.ols.OLS(df.a, df.b, nw_lags=1))

-------------------------Summary of Regression Analysis-------------------------

Formula: Y ~ <x> + <intercept>

Number of Observations:         11
Number of Degrees of Freedom:   2

R-squared:         0.2807
Adj R-squared:     0.2007

Rmse:              2.0880

F-stat (1, 9):     1.5943, p-value:     0.2384

Degrees of Freedom: model 1, resid 9

-----------------------Summary of Estimated Coefficients------------------------
      Variable       Coef    Std Err     t-stat    p-value    CI 2.5%   CI 97.5%
--------------------------------------------------------------------------------
             x     0.5595     0.4431       1.26     0.2384    -0.3090     1.4280
     intercept     2.0576     2.9413       0.70     0.5019    -3.7073     7.8226

*** The calculations are Newey-West adjusted with lags     1

---------------------------------End of Summary---------------------------------

