statsmodels.regression.linear_model.GLSAR

class statsmodels.regression.linear_model.GLSAR(endog, exog=None, rho=1, missing='none', hasconst=None, **kwargs)

Generalized Least Squares with AR covariance structure

Parameters:
endog : array_like

A 1-d endogenous response variable. The dependent variable.

exog : array_like

A nobs x k array where nobs is the number of observations and k is the number of regressors. An intercept is not included by default and should be added by the user. See statsmodels.tools.add_constant and the sketch after this parameter list.

rho : int

The order of the autoregressive covariance.

missing : str

Available options are ‘none’, ‘drop’, and ‘raise’. If ‘none’, no nan checking is done. If ‘drop’, any observations with nans are dropped. If ‘raise’, an error is raised. Default is ‘none’.

hasconst : None or bool

Indicates whether the RHS includes a user-supplied constant. If True, a constant is not checked for and k_constant is set to 1 and all result statistics are calculated as if a constant is present. If False, a constant is not checked for and k_constant is set to 0.

**kwargs

Extra arguments that are used to set model properties when using the formula interface.
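
As a quick sketch of how these parameters fit together (the data values below are made up for illustration; the row containing NaN is removed because missing='drop', and hasconst=True tells the model the constant column is already present):

>>> import numpy as np
>>> import statsmodels.api as sm
>>> y = np.array([1.0, 3.0, np.nan, 5.0, 8.0, 10.0, 9.0])
>>> x = sm.add_constant(np.arange(1.0, 8.0))
>>> mod = sm.GLSAR(y, x, rho=2, missing='drop', hasconst=True)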

Notes

GLSAR is considered to be experimental. The linear autoregressive process of order p, AR(p), is defined as

    u_t = rho_1 * u_(t-1) + rho_2 * u_(t-2) + ... + rho_p * u_(t-p) + e_t

where e_t is a white-noise error term. GLSAR fits the regression under the assumption that the errors follow this AR(p) process.
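
The whiten method listed below applies the corresponding quasi-differencing transform to the data before the least-squares fit. A minimal NumPy sketch of that transform, for illustration only (ar_whiten is a hypothetical helper, not part of statsmodels):

import numpy as np

def ar_whiten(x, rho):
    # Quasi-difference a 1-d series with AR(p) coefficients rho:
    #   x*_t = x_t - rho_1 * x_(t-1) - ... - rho_p * x_(t-p)
    # The first p observations are dropped, as GLSAR's whiten does.
    x = np.asarray(x, dtype=float)
    rho = np.atleast_1d(rho)
    p = len(rho)
    out = x[p:].copy()
    for i, r in enumerate(rho, start=1):
        out -= r * x[p - i:-i]
    return out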

Examples

>>> import numpy as np
>>> import statsmodels.api as sm
>>> X = range(1,8)
>>> X = sm.add_constant(X)
>>> Y = [1,3,4,5,8,10,9]
>>> model = sm.GLSAR(Y, X, rho=2)
>>> for i in range(6):
...     results = model.fit()
...     print("AR coefficients: {0}".format(model.rho))
...     rho, sigma = sm.regression.yule_walker(results.resid,
...                                            order=model.order)
...     model = sm.GLSAR(Y, X, rho)
...
AR coefficients: [ 0.  0.]
AR coefficients: [-0.52571491 -0.84496178]
AR coefficients: [-0.6104153  -0.86656458]
AR coefficients: [-0.60439494 -0.857867  ]
AR coefficients: [-0.6048218  -0.85846157]
AR coefficients: [-0.60479146 -0.85841922]
>>> results.params
array([-0.66661205,  1.60850853])
>>> results.tvalues
array([ -2.10304127,  21.8047269 ])
>>> print(results.t_test([1, 0]))
<T test: effect=array([-0.66661205]), sd=array([[ 0.31697526]]),
 t=array([[-2.10304127]]), p=array([[ 0.06309969]]), df_denom=3>
>>> print(results.f_test(np.identity(2)))
<F test: F=array([[ 1815.23061844]]), p=[[ 0.00002372]],
 df_denom=3, df_num=2>

Or, equivalently

>>> model2 = sm.GLSAR(Y, X, rho=2)
>>> res = model2.iterative_fit(maxiter=6)
>>> model2.rho
array([-0.60479146, -0.85841922])
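
The results object returned by iterative_fit is a standard regression results instance, so the post-estimation output shown above (for example res.params, res.tvalues, res.t_test, and res.f_test) is available from it as well.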
Attributes:
df_model

The model degree of freedom.

df_resid

The residual degree of freedom.

endog_names

Names of endogenous variables.

exog_names

Names of exogenous variables.

Methods

fit([method, cov_type, cov_kwds, use_t])

Full fit of the model.

fit_regularized([method, alpha, L1_wt, ...])

Return a regularized fit to a linear regression model.

from_formula(formula, data[, subset, drop_cols])

Create a Model from a formula and dataframe (see the sketch after this list).

get_distribution(params, scale[, exog, ...])

Construct a random number generator for the predictive distribution.

hessian(params)

The Hessian matrix of the model.

hessian_factor(params[, scale, observed])

Compute weights for calculating Hessian.

information(params)

Fisher information matrix of model.

initialize()

Initialize model components.

iterative_fit([maxiter, rtol])

Perform an iterative two-stage procedure to estimate a GLS model.

loglike(params)

Compute the value of the Gaussian log-likelihood function at params.

predict(params[, exog])

Return linear predicted values from a design matrix.

score(params)

Score vector of model.

whiten(x)

Whiten a series of columns according to an AR(p) covariance structure.
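
The formula interface mentioned under **kwargs goes through from_formula; a minimal sketch, assuming the model keyword rho can be passed through as shown (the DataFrame df and its column names are hypothetical):

>>> import pandas as pd
>>> import statsmodels.api as sm
>>> df = pd.DataFrame({"y": [1, 3, 4, 5, 8, 10, 9], "x": range(1, 8)})
>>> mod = sm.GLSAR.from_formula("y ~ x", data=df, rho=2)
>>> res = mod.iterative_fit(maxiter=6)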

Properties

df_model

The model degree of freedom.

df_resid

The residual degree of freedom.

endog_names

Names of endogenous variables.

exog_names

Names of exogenous variables.

