Autoregressive Moving Average (ARMA): Sunspots dataΒΆ

In [ ]:
from __future__ import print_function
import numpy as np
from scipy import stats
import pandas as pd
import matplotlib.pyplot as plt

import statsmodels.api as sm
In [ ]:
from statsmodels.graphics.api import qqplot

Sunspots Data

In [ ]:
print(sm.datasets.sunspots.NOTE)
::

    Number of Observations - 309 (Annual 1700 - 2008)
    Number of Variables - 1
    Variable name definitions::

        SUNACTIVITY - Number of sunspots for each year

    The data file contains a 'YEAR' variable that is not returned by load.

In [ ]:
dta = sm.datasets.sunspots.load_pandas().data
In [ ]:
dta.index = pd.Index(sm.tsa.datetools.dates_from_range('1700', '2008'))
del dta["YEAR"]
In [ ]:
dta.plot(figsize=(12,8));
In [ ]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(dta.values.squeeze(), lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(dta.values.squeeze(), lags=40, ax=ax2)
In [ ]:
arma_mod20 = sm.tsa.ARMA(dta, (2,0)).fit()
print(arma_mod20.params)
const                49.659269
ar.L1.SUNACTIVITY     1.390656
ar.L2.SUNACTIVITY    -0.688571
dtype: float64
In [ ]:
arma_mod30 = sm.tsa.ARMA(dta, (3,0)).fit()
In [ ]:
print(arma_mod20.aic, arma_mod20.bic, arma_mod20.hqic)
2622.6363380653256 2637.56970317 2628.60672591
In [ ]:
print(arma_mod30.params)
const                49.749903
ar.L1.SUNACTIVITY     1.300810
ar.L2.SUNACTIVITY    -0.508092
ar.L3.SUNACTIVITY    -0.129650
dtype: float64
In [ ]:
print(arma_mod30.aic, arma_mod30.bic, arma_mod30.hqic)
2619.403628696675 2638.07033508 2626.8666135
  • Does our model obey the theory?
In [ ]:
sm.stats.durbin_watson(arma_mod30.resid.values)
In [ ]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax = arma_mod30.resid.plot(ax=ax);
In [ ]:
resid = arma_mod30.resid
In [ ]:
stats.normaltest(resid)
In [ ]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
fig = qqplot(resid, line='q', ax=ax, fit=True)
In [ ]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(resid.values.squeeze(), lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(resid, lags=40, ax=ax2)
In [ ]:
r,q,p = sm.tsa.acf(resid.values.squeeze(), nlags=40, qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
           AC          Q      Prob(>Q)
lag
1    0.009179   0.026286  8.712027e-01
2    0.041793   0.573037  7.508733e-01
3   -0.001335   0.573596  9.024494e-01
4    0.136089   6.408916  1.706206e-01
5    0.092468   9.111828  1.046860e-01
6    0.091948  11.793245  6.674342e-02
7    0.068748  13.297202  6.518980e-02
8   -0.015020  13.369231  9.976127e-02
9    0.187592  24.641906  3.393914e-03
10   0.213718  39.321990  2.229478e-05
11   0.201082  52.361129  2.344958e-07
12   0.117182  56.804175  8.574305e-08
13  -0.014055  56.868312  1.893912e-07
14   0.015398  56.945551  3.997679e-07
15  -0.024967  57.149306  7.741510e-07
16   0.080916  59.296760  6.872189e-07
17   0.041138  59.853730  1.110947e-06
18  -0.052021  60.747420  1.548437e-06
19   0.062496  62.041682  1.831649e-06
20  -0.010302  62.076969  3.381254e-06
21   0.074453  63.926642  3.193599e-06
22   0.124955  69.154758  8.978395e-07
23   0.093162  72.071020  5.799812e-07
24  -0.082152  74.346673  4.713038e-07
25   0.015695  74.430029  8.289080e-07
26  -0.025037  74.642887  1.367290e-06
27  -0.125861  80.041129  3.722589e-07
28   0.053225  81.009964  4.716306e-07
29  -0.038693  81.523789  6.916674e-07
30  -0.016904  81.622208  1.151668e-06
31  -0.019296  81.750920  1.868776e-06
32   0.104990  85.575045  8.928012e-07
33   0.040086  86.134547  1.247516e-06
34   0.008829  86.161790  2.047837e-06
35   0.014588  86.236427  3.263827e-06
36  -0.119329  91.248877  1.084461e-06
37  -0.036665  91.723845  1.521932e-06
38  -0.046193  92.480493  1.938747e-06
39  -0.017768  92.592861  2.990698e-06
40  -0.006220  92.606684  4.697013e-06
  • The Ljung-Box Q statistics become significant from about lag 9 onward, which indicates a lack of fit.
  • In-sample dynamic prediction. How well does our model do?
In [ ]:
predict_sunspots = arma_mod30.predict('1990', '2012', dynamic=True)
print(predict_sunspots)
1990-12-31    167.047417
1991-12-31    140.993004
1992-12-31     94.859113
1993-12-31     46.860889
1994-12-31     11.242556
1995-12-31     -4.721343
1996-12-31     -1.166978
1997-12-31     16.185618
1998-12-31     39.021814
1999-12-31     59.449818
2000-12-31     72.170108
2001-12-31     75.376767
2002-12-31     70.436452
2003-12-31     60.731580
2004-12-31     50.201783
2005-12-31     42.076000
2006-12-31     38.114247
2007-12-31     38.454593
2008-12-31     41.963761
2009-12-31     46.869234
2010-12-31     51.423214
2011-12-31     54.399679
2012-12-31     55.321659
Freq: A-DEC, dtype: float64
In [ ]:
fig, ax = plt.subplots(figsize=(12, 8))
ax = dta.loc['1950':].plot(ax=ax)
fig = arma_mod30.plot_predict('1990', '2012', dynamic=True, ax=ax, plot_insample=False)
In [ ]:
def mean_forecast_err(y, yhat):
    return y.sub(yhat).mean()
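
The mean forecast error above is a signed measure, so it reports bias rather than overall accuracy. As an optional companion (not in the original notebook), a root-mean-square variant can be defined the same way:

In [ ]:
# Optional sketch: root-mean-square forecast error, which penalizes large
# misses regardless of their sign (compare with the signed mean error above).
def rmse_forecast_err(y, yhat):
    return np.sqrt(y.sub(yhat).pow(2).mean())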
In [ ]:
mean_forecast_err(dta.SUNACTIVITY, predict_sunspots)

Exercise: Can you obtain a better fit for the Sunspots model? (Hint: sm.tsa.AR has a method select_order)
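
A possible starting point for the exercise, sketched under the assumption that the older sm.tsa.AR estimator from the hint is available (the maxlag value and the choice of 'aic' are illustrative, not the notebook's answer):

In [ ]:
# Sketch: choose an AR order by information criterion via select_order (as
# suggested in the hint), then fit an AR model of that order.
ar_mod = sm.tsa.AR(dta['SUNACTIVITY'])
best_lag = ar_mod.select_order(maxlag=15, ic='aic')
print(best_lag)
ar_res = ar_mod.fit(maxlag=best_lag)
print(ar_res.params)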

Simulated ARMA(4,1): Model Identification is Difficult

In [ ]:
from statsmodels.tsa.arima_process import arma_generate_sample, ArmaProcess
In [ ]:
np.random.seed(1234)
# include zero-th lag
arparams = np.array([1, .75, -.65, -.55, .9])
maparams = np.array([1, .65])
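
Note that ArmaProcess takes the coefficients of the AR and MA lag polynomials, including the zero-th lag, so the AR terms above enter with the opposite sign of the usual ARMA notation. As a sketch (assuming a statsmodels version that provides the ArmaProcess.from_coeffs constructor), the same process can be built from process-style coefficients:

In [ ]:
# Equivalent construction from "process" coefficients (phi_i, theta_i); the
# leading 1 and the AR sign flip are handled internally.
arma_alt = ArmaProcess.from_coeffs([-.75, .65, .55, -.9], [.65])
print(arma_alt.ar, arma_alt.ma)   # should match arparams and maparams above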

Let's make sure this model is estimable.

In [ ]:
arma_t = ArmaProcess(arparams, maparams)
In [ ]:
arma_t.isinvertible()
In [ ]:
arma_t.isstationary()
  • What does this mean?
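
One way to see what the stationarity check reports (a sketch, assuming the arroots attribute of ArmaProcess): the process is stationary only if every root of the AR lag polynomial lies outside the unit circle.

In [ ]:
# Roots of the AR lag polynomial; stationarity requires all of them to have
# modulus greater than one.
print(arma_t.arroots)
print(np.abs(arma_t.arroots))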
In [ ]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax.plot(arma_t.generate_sample(size=50));
In [ ]:
arparams = np.array([1, .35, -.15, .55, .1])
maparams = np.array([1, .65])
arma_t = ArmaProcess(arparams, maparams)
arma_t.isstationary()
In [ ]:
arma_rvs = arma_t.generate_sample(size=500, burnin=250, scale=2.5)
In [ ]:
fig = plt.figure(figsize=(12,8))
ax1 = fig.add_subplot(211)
fig = sm.graphics.tsa.plot_acf(arma_rvs, lags=40, ax=ax1)
ax2 = fig.add_subplot(212)
fig = sm.graphics.tsa.plot_pacf(arma_rvs, lags=40, ax=ax2)
  • For mixed ARMA processes the autocorrelation function is a mixture of exponentials and damped sine waves after (q - p) lags.
  • The partial autocorrelation function is a mixture of exponentials and damped sine waves after (p - q) lags (compare with the theoretical values sketched below).
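
For comparison with the sample estimates plotted above, the theoretical autocorrelations of the generating process can be computed directly (a sketch; the acf and pacf methods of ArmaProcess are assumed to take the number of lags as their first argument):

In [ ]:
# First 10 theoretical autocorrelations and partial autocorrelations of the
# ARMA(4,1) process that generated arma_rvs.
print(arma_t.acf(10))
print(arma_t.pacf(10))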
In [ ]:
arma11 = sm.tsa.ARMA(arma_rvs, (1,1)).fit()
resid = arma11.resid
r,q,p = sm.tsa.acf(resid, nlags=40, qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
In [ ]:
arma41 = sm.tsa.ARMA(arma_rvs, (4,1)).fit()
resid = arma41.resid
r,q,p = sm.tsa.acf(resid, nlags=40, qstat=True)
data = np.c_[range(1,41), r[1:], q, p]
table = pd.DataFrame(data, columns=['lag', "AC", "Q", "Prob(>Q)"])
print(table.set_index('lag'))
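
Because identification from the correlograms alone is difficult, the information criteria used earlier for the sunspots models give another quick comparison of the two candidate orders (lower is better):

In [ ]:
# Compare the ARMA(1,1) and ARMA(4,1) fits with the same information criteria
# as above; the data were generated from an ARMA(4,1) process.
print(arma11.aic, arma11.bic, arma11.hqic)
print(arma41.aic, arma41.bic, arma41.hqic)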

Exercise: How good an in-sample prediction can you obtain for another series, say, CPI?

In [ ]:
macrodta = sm.datasets.macrodata.load_pandas().data
macrodta.index = pd.Index(sm.tsa.datetools.dates_from_range('1959Q1', '2009Q3'))
cpi = macrodta["cpi"]

Hint:

In [ ]:
fig = plt.figure(figsize=(12,8))
ax = fig.add_subplot(111)
ax = cpi.plot(ax=ax);
ax.legend();

The p-value of the augmented Dickey-Fuller unit-root test is very large, so we cannot reject the null hypothesis that CPI has a unit root:

In [ ]:
print(sm.tsa.adfuller(cpi)[1])
0.990432818834
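
A hedged sketch for the exercise: since the test cannot reject a unit root in the CPI level, one natural route is to difference the series once, re-check, and fit a low-order ARMA to the differences. The (1,1) order and the reuse of mean_forecast_err from above are illustrative assumptions, not a recommended specification.

In [ ]:
# Sketch only: work with the first difference of CPI instead of the level.
dcpi = cpi.diff().dropna()
print(sm.tsa.adfuller(dcpi)[1])                  # re-run the unit-root test
arma_dcpi = sm.tsa.ARMA(dcpi, (1, 1)).fit()      # illustrative order choice
print(arma_dcpi.params)
predict_dcpi = arma_dcpi.predict(dynamic=False)  # in-sample one-step predictions
print(mean_forecast_err(dcpi, predict_dcpi))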