Time Series in Python — Part 2: Dealing with seasonal data

In the first part, you learned about trends and seasonality, smoothing models and ARIMA processes.

In this part, you’ll learn how to deal with seasonal models and how to implement Seasonal Holt-Winters and Seasonal ARIMA (SARIMA).

Getting the data

We’ll use the “Monthly milk production” data.

Seasonal decomposition (STL)

In the previous part, I talked briefly about seasonal decomposition.

The idea behind seasonal decomposition is that any series can be decomposed into a sum (or a product) of three components: a trend, a seasonal component, and residuals. In the additive case, y(t) = trend(t) + seasonal(t) + residual(t).

In our case, we’ll use the seasonal_decompose function provided by statsmodels (see the sketch below). The trend component is an increasing curve which seems to reach a plateau and eventually decrease at the end.

This is useful information for when we remove the trend to stationarize the data.
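A minimal sketch of the loading and decomposition steps, assuming the dataset sits in a local CSV with a Month column (the file name here is hypothetical):

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical file name; adjust to wherever you saved the dataset.
df = pd.read_csv("monthly-milk-production.csv",
                 parse_dates=["Month"], index_col="Month")
series = df.iloc[:, 0]  # monthly milk production per cow

# Additive decomposition with a 12-month seasonal period
# (the `period` argument is called `freq` in older statsmodels versions).
decomposition = seasonal_decompose(series, model="additive", period=12)
decomposition.plot()
```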

Stationarizing the data

Dealing with the trend

Based on our decomposition, we see that the trend follows an upwards movement before plateauing (and maybe decreasing) at the end.

In order to remove the trend, we will try an original approach, consisting of regressing the trend given by the STL decomposition.

If the regression is satisfactory, we will then try to “deflate” the series by subtracting the obtained regression from the original series. A sketch of this idea follows below.
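The article doesn’t show the exact regression specification; a minimal sketch, assuming a quadratic fit of the extracted trend on a time index and continuing from the loading sketch above:

```python
import numpy as np
import statsmodels.api as sm

# Regress the extracted trend component on time (the quadratic form is an assumption).
trend = decomposition.trend.dropna()
t = np.arange(len(trend))
X = sm.add_constant(np.column_stack([t, t ** 2]))
ols = sm.OLS(trend.values, X).fit()
print(ols.rsquared)

# "Deflate" the series by subtracting the fitted trend from the original data.
fitted = pd.Series(ols.predict(X), index=trend.index)
deflated = series.loc[trend.index] - fitted
```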

The r-squared coefficient from our regression is quite good (0.988), but looking at the residuals we notice that their variance increases from left to right.

This is a sign of heteroskedasticity.

You can look up the exact definition yourself (in short, the variance of the residuals is not constant over time); just keep in mind that this is not good and that something is missing from our model.

We’ll abandon the idea of regressing the trend as it requires too much effort for now.

Dealing with the trend, ep. 2

Our series still needs stationarizing; we’ll go back to basic methods to see if we can remove this trend.

We’ll operate in several steps:
- ADF test on the raw data to check stationarity
- ADF test on the 12-month difference
- ADF test on the 12-month difference of the logged data
- ADF test on the data minus its 12-month moving average
If you need the code, refer to the first part to learn how to conduct an ADF test.
Let’s confirm that the data minus its 12-month MA is stationary with a KPSS test (see the sketch after this paragraph). The test statistic is below the thresholds, therefore our series is stationary (recall that for the KPSS test, the null hypothesis is that the series is stationary).
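A minimal sketch of the KPSS test on the detrended series, continuing with the series variable from above:

```python
from statsmodels.tsa.stattools import kpss

# Subtract the 12-month moving average, then run the KPSS test.
detrended = (series - series.rolling(window=12).mean()).dropna()

# nlags="auto" requires a recent statsmodels version.
stat, p_value, lags, crit = kpss(detrended, regression="c", nlags="auto")
print(f"KPSS statistic: {stat:.3f}, p-value: {p_value:.3f}")
print("Critical values:", crit)
# Null hypothesis of KPSS: the series IS stationary. A statistic below
# the critical values means we cannot reject stationarity.
```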

Fitting a Holt-Winters Seasonal Smoothing model

Remember from the first part that a Holt-Winters model has several parts: a level, a trend, and, in the case of seasonal smoothing, a seasonal component.

The math behind this is a bit hard, so I won’t put it here; just remember the three components above.

Here is what we get after fitting our model (a sketch of the fits follows below): the model without damping seems to perform better, as its AIC is lower.
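A minimal sketch of the two fits, with and without a damped trend (note: the damped_trend argument was named damped in older statsmodels versions):

```python
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Additive trend and seasonality with a 12-month period.
hw = ExponentialSmoothing(series, trend="add", seasonal="add",
                          seasonal_periods=12).fit()
hw_damped = ExponentialSmoothing(series, trend="add", seasonal="add",
                                 seasonal_periods=12, damped_trend=True).fit()

# Compare on AIC: the lower, the better.
print("AIC without damping:", hw.aic)
print("AIC with damping:   ", hw_damped.aic)
```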

Seasonal ARIMA

Good old ARIMA is back, but this time we’ll add a seasonal component to it! A SARIMA model is composed of two parts: a non-seasonal part and a seasonal part.

If you remember the previous part, the non-seasonal part is composed of an AR(p) term, an MA(q) term, and an I(d) term, and we write ARIMA(p, d, q).

Well, it’s the same for the seasonal part! Our SARIMA is noted ARIMA(p, d, q)x(P, D, Q), where the seasonal terms apply at the seasonal lag (here, 12 months).

Question: how do we determine all these p, d, q, P, D, Q? Each time you hear “ARIMA”, think “ACF/PACF”.

With the ACF and PACF plots, we’ll be able to guess reasonable values for our parameters.

Let’s plot the ACF and PACF of our stationarized data (a plotting sketch follows below). The data is clearly not stationary, given the slow decay of the spikes observed in the ACF.
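A minimal plotting sketch, assuming x is the 12-lag (seasonal) difference of the series:

```python
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

# Assumption: x is the seasonally differenced series.
x = series.diff(12).dropna()

fig, axes = plt.subplots(2, 1, figsize=(10, 6))
plot_acf(x, lags=36, ax=axes[0])
plot_pacf(x, lags=36, ax=axes[1])
plt.show()
```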

This is a sign that we should take an additional difference of x (see below). Our series is now stationary.
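Continuing the sketch above, the additional difference is simply:

```python
# One more 1-lag difference on top of the seasonal difference.
x_diff = x.diff().dropna()
plot_acf(x_diff, lags=36)
plt.show()
```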

To sum up, we have:
- taken a 12-lag difference of the stationarized series, which corresponds to a “1-seasonal-lag” difference;
- taken a 1-lag difference of the 12-lag differenced series, which corresponds to a non-seasonal difference order of 1.
This gives us a SARIMA(?, 1, ?)x(?, 1, ?).

What about the AR and MA terms? Well, we know that there are still significant spikes in the ACF and PACF, so we could decide to add AR terms as well as MA terms.

Which one should we start with? This time, we’ll use a grid search over the possible combinations of AR and MA terms (limited to a maximum order of 2); a sketch follows below. When running the different configurations, the lowest AIC is obtained with a SARIMA(0, 1, 2)x(0, 1, 2).
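A minimal grid-search sketch using statsmodels’ SARIMAX, with d = D = 1 fixed as determined above:

```python
import itertools
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Try every (p, q, P, Q) in {0, 1, 2}^4 and keep the fit with the lowest AIC.
best_aic, best_order = float("inf"), None
for p, q, P, Q in itertools.product(range(3), repeat=4):
    try:
        res = SARIMAX(series, order=(p, 1, q),
                      seasonal_order=(P, 1, Q, 12)).fit(disp=False)
    except Exception:
        continue  # some configurations may fail to converge
    if res.aic < best_aic:
        best_aic, best_order = res.aic, (p, q, P, Q)

print("Best (p, q, P, Q):", best_order, "with AIC:", best_aic)
```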

With the plot_diagnostics() method, we can show the residuals and check their normality (remember, the residuals should be normally distributed, with zero mean and unit variance, and be uncorrelated).
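For instance, refitting the selected configuration and plotting its diagnostics:

```python
# Refit the best configuration found above and inspect the residuals.
best_model = SARIMAX(series, order=(0, 1, 2),
                     seasonal_order=(0, 1, 2, 12)).fit(disp=False)
best_model.plot_diagnostics(figsize=(12, 8))
plt.show()
```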

Notice how the normal Q-Q plot and the histogram show that our residuals follow a normal distribution (we could use additional tests such as Jarque-Bera to test for normality; the Ljung-Box test, for its part, checks that the residuals are uncorrelated).

The correlogram also shows that the residuals are uncorrelated!

Additional reading

I highly recommend the two below if you are interested in time-series forecasting and analysis:
- Forecasting: Principles and Practice (Hyndman & Athanasopoulos)
- Statistical Forecasting (Robert Nau)
You can also check this article on Neural Turing Machines and this article on Attention-based Neural Networks!
