STAT 479: Time Series Analysis
Welcome to the course website for STAT 479!
Information and resources for the course can be found on this page. Click on the section headings to expand them. For assignment submission and grades please see Canvas.
Announcements:
Assignment 5 has now been posted. It is due April 11th at 11:59 PM.
The final exam is scheduled to take place on April 22nd at 8:30 AM in our classroom, GSB 8-59.
The exam is three hours long and will consist of long-answer questions. Coverage is cumulative, but a larger emphasis will be placed on material from the last third of the course. You will be allowed three double-sided sheets of notes. You shouldn't need it, but you may bring a non-programmable calculator.
Some practice problems for the final exam can be found here.
I will hold extra office hours on Wednesday, April 16th, and Thursday, April 17th, from 11:00 AM to 12:00 PM.
I recommend reviewing the previous assignments and quizzes, as well as their solutions, which are posted on Canvas.
Week 1: Stochastic processes, time series, classical decomposition into trend + seasonal + noise components, strict and weak stationarity, autocovariance and autocorrelation functions and their properties, white noise, random walks.
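As a quick illustration (a minimal base-R sketch, not taken from the lecture notes), the snippet below simulates white noise and a random walk and contrasts their sample ACFs; the random walk is not stationary, and its sample ACF decays very slowly:

```r
set.seed(479)                 # arbitrary seed for reproducibility
n  <- 200
wn <- rnorm(n)                # Gaussian white noise
rw <- cumsum(wn)              # random walk: X_t = X_{t-1} + W_t

op <- par(mfrow = c(2, 2))
plot.ts(wn, main = "White noise")
acf(wn, main = "Sample ACF, white noise")
plot.ts(rw, main = "Random walk")
acf(rw, main = "Sample ACF, random walk")
par(op)
```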
Week 2: Computing the ACVF for an MA(q) and the AR(1) process, bias and variance of the sample mean, sample autocovariance function, asymptotic distribution of the sample autocorrelation of white noise, OLS regression for a mean function with trends or periodicity.
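For example (an illustrative sketch with arbitrary parameter values), base R can compute the theoretical MA(1) ACF directly, and acf() draws the dashed bands implied by the white-noise asymptotics:

```r
# Theoretical ACF of an MA(1): rho(1) = theta / (1 + theta^2), zero beyond lag 1
ARMAacf(ma = 0.8, lag.max = 5)

# Sample ACF of white noise; the dashed lines are the asymptotic 95% bounds
# +/- 1.96 / sqrt(n) from the limit theory for the sample ACF of white noise
acf(rnorm(500))
```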
Week 3: More on OLS regression, a brief discussion of GLS regression, smoothing time series via a window smoother and kernel smoothers, the backshift operator, differencing, seasonal differencing, revisiting the MA(q) process in terms of backshifts, definition of the AR(p) process, discussion on the identifiability of these processes.
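As a small sketch of these operations in base R (using the built-in AirPassengers series purely as an example), note that diff() implements (1 - B) and diff(x, lag = s) implements (1 - B^s):

```r
x  <- as.numeric(AirPassengers)      # classic monthly series shipped with R
sm <- stats::filter(x, rep(1/5, 5))  # 5-point moving-average (window) smoother
dx <- diff(x)                        # first difference (1 - B) X_t
sx <- diff(x, lag = 12)              # seasonal difference (1 - B^12) X_t
```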
Week 4: Inverting the AR(p) polynomial (see this for a short derivation of the inverse of (1 - phi B) that I did in class), convergence of linear filters applied to stationary processes, existence of stationary AR(p) solutions, roots of the AR(p) polynomial, introduction to ARMA(p,q) processes.
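In brief, the identity behind that derivation is the geometric series; the display below is a one-line summary of the argument (my shorthand, not the full in-class derivation):

```latex
% For |\phi| < 1, the geometric series gives the inverse of (1 - \phi B):
(1 - \phi B)^{-1} = \sum_{j=0}^{\infty} \phi^{j} B^{j},
\qquad\text{so } (1 - \phi B) X_t = W_t
\;\Longrightarrow\;
X_t = \sum_{j=0}^{\infty} \phi^{j} W_{t-j}.
```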
Week 5: Finding stationary solutions to the ARMA equations. Causality and invertibility of ARMA processes. Finding phi(z)^(-1)theta(z) by matching the coefficients of polynomials.
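As an illustration, R's ARMAtoMA() carries out exactly this coefficient matching numerically (the parameter values below are arbitrary):

```r
# psi-weights of the causal representation X_t = sum_{j>=0} psi_j W_{t-j},
# i.e. the series coefficients of phi(z)^(-1) theta(z); psi_0 = 1 is implicit
ARMAtoMA(ar = 0.5, ma = 0.4, lag.max = 8)
# For an ARMA(1,1), matching coefficients gives psi_j = (phi + theta) phi^(j-1)
```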
Week 6: Computation of the ACVF for ARMA processes by solving a linear system of equations via multiplying the ARMA equation by X_{t-k}. Finding the best linear predictor of a random variable. The minimum attainable prediction error.
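A minimal sketch of the normal-equations computation, using the AR(1) ACVF as a concrete example (values chosen arbitrarily); for an AR(1) the BLP of X_{n+1} should reduce to phi * X_n, with prediction error sigma^2:

```r
# BLP of X_{n+1} given X_1, ..., X_n: solve the normal equations Gamma_n a = gamma_n.
# Here gamma(h) = phi^|h| / (1 - phi^2) is the AR(1) ACVF with sigma^2 = 1.
phi <- 0.6; n <- 5
gamma <- function(h) phi^abs(h) / (1 - phi^2)
Gam <- outer(1:n, 1:n, function(i, j) gamma(i - j))  # Cov(X_i, X_j)
gam <- gamma(n:1)                                    # Cov(X_{n+1}, X_i), i = 1..n
a   <- solve(Gam, gam)        # BLP coefficients; only the last (phi) is nonzero
mse <- gamma(0) - sum(a * gam)  # minimum attainable prediction error (here 1)
```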
Week 7: Reading week.
Week 8: Examples of finding the BLPs for AR and MA processes. Making predictions k-steps ahead. Backcasting and the fact that the backcasting regression coefficients equal the forecasting regression coefficients in a reversed order. Partial autocorrelation and its interpretation as correlation in a conditional distribution. The PACF function and its behavior for AR, MA and ARMA processes. The sample PACF. Theory on the asymptotic distribution of the sample ACF (Bartlett's formula) and on the asymptotic distribution of the PACF for MA and AR processes respectively.
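A short simulation sketch (arbitrary parameter values) displaying the cutoff behavior discussed in class: the PACF of an AR(p) cuts off after lag p, while the ACF of an MA(q) cuts off after lag q:

```r
set.seed(479)
x_ar <- arima.sim(n = 500, model = list(ar = c(0.6, 0.3)))  # AR(2)
x_ma <- arima.sim(n = 500, model = list(ma = 0.8))          # MA(1)

op <- par(mfrow = c(2, 2))
acf(x_ar);  pacf(x_ar)   # PACF of an AR(2) should cut off after lag 2
acf(x_ma);  pacf(x_ma)   # ACF of an MA(1) should cut off after lag 1
par(op)
```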
Week 9: The Durbin-Levinson algorithm. Predictions using the infinite past and the resulting prediction error. Prediction intervals under Gaussianity. Yule-Walker equations and estimates for AR(p) processes. The asymptotic distribution of the Yule-Walker estimates.
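For instance, ar.yw() in base R solves the sample Yule-Walker equations and reports an estimate of the asymptotic covariance matrix (a quick sketch with arbitrary parameters):

```r
set.seed(479)
x   <- arima.sim(n = 1000, model = list(ar = c(0.6, 0.3)))
fit <- ar.yw(x, order.max = 2, aic = FALSE)  # solve the sample Yule-Walker equations
fit$ar            # estimates of (phi_1, phi_2)
fit$asy.var.coef  # estimated asymptotic covariance matrix of the estimates
```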
Week 10: Closed-form solutions via least squares for maximum likelihood estimates using the conditional likelihood for AR(p) processes. The challenge of obtaining closed-form solutions for general ARMA processes. A very brief discussion on iterative methods for optimization. The asymptotic distribution of the ARMA MLEs. Observation that an ARMA(p,q) is nested in an ARMA(p',q') for p <= p', q <= q'. Initial discussion on choosing the orders p and q.
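As a sketch, arima() in base R computes Gaussian maximum likelihood estimates for ARMA models numerically; the standard errors it reports come from the asymptotic theory (arbitrary simulated example):

```r
set.seed(479)
x   <- arima.sim(n = 1000, model = list(ar = c(0.6, 0.3)))
fit <- arima(x, order = c(2, 0, 0), include.mean = FALSE, method = "ML")
fit$coef                  # Gaussian MLEs of phi_1 and phi_2
sqrt(diag(fit$var.coef))  # asymptotic standard errors
```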
Week 11: The AIC and AICc and how to use them for model selection. ARIMA(p,d,q) processes. Unit root tests for non-stationarity (the augmented Dickey-Fuller test). SARIMA(p,d,q)x(P,D,Q)_s processes. See also Prof. Kashlak's notes.
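A brief sketch of both ideas in R; note that the augmented Dickey-Fuller test below assumes the tseries package is installed (it is not part of base R):

```r
set.seed(479)
x <- arima.sim(n = 500, model = list(ar = 0.6, ma = 0.4))
AIC(arima(x, order = c(1, 0, 1)))   # smaller AIC is preferred
AIC(arima(x, order = c(2, 0, 2)))   # the larger nested model is penalized

# Augmented Dickey-Fuller test for a unit root (requires the tseries package)
# install.packages("tseries")
tseries::adf.test(x)
```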
Week 12: An introduction to the discrete Fourier transform (DFT). Review of some theory on complex numbers, including Euler's formula. Representing our data in terms of a summation of sines and cosines. Aliasing at frequencies larger than one half. The orthogonality relation between sines and cosines and how this can be used to find the Fourier coefficients by matrix multiplication. A brief discussion on the fast Fourier transform (FFT). The periodogram as the squared Fourier coefficients up to some rescaling at the fundamental frequencies.
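As a sketch, the periodogram can be computed directly from fft(); be aware that scaling conventions differ between references, and that spec.pgram() uses its own normalization and demeans the series:

```r
x <- rnorm(128)              # any series; white noise as a placeholder
n <- length(x)
I <- Mod(fft(x))^2 / n       # periodogram ordinates at the Fourier frequencies
f <- (0:(n - 1)) / n         # fundamental frequencies j/n
plot(f[2:(n/2)], I[2:(n/2)], type = "h",
     xlab = "frequency", ylab = "periodogram")
# spec.pgram(x) gives essentially the same picture, up to its own scaling
```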
Week 13: Some notes on the spectral density function and Fourier transforms can be found here. The spectral density as a DTFT of the ACVF. Various properties and the interpretation of the spectral density. The magic formula for the spectral density of an ARMA process and simple examples of this. The parametric estimator of the spectral density.
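A minimal hand-rolled sketch of the "magic formula" (written with frequency in cycles per unit time on [0, 1/2]; other references include an extra factor such as 2*pi, so treat the normalization as a convention):

```r
# f(w) = sigma2 * |theta(e^{-2 pi i w})|^2 / |phi(e^{-2 pi i w})|^2,
# with phi(z) = 1 - phi_1 z - ... and theta(z) = 1 + theta_1 z + ...
arma_spec <- function(w, phi = numeric(0), theta = numeric(0), sigma2 = 1) {
  z  <- exp(-2i * pi * w)     # evaluate the polynomials at e^{-2 pi i w}
  ph <- 1 + 0 * z
  th <- 1 + 0 * z
  for (j in seq_along(phi))   ph <- ph - phi[j]   * z^j
  for (j in seq_along(theta)) th <- th + theta[j] * z^j
  sigma2 * Mod(th)^2 / Mod(ph)^2
}

w <- seq(0, 0.5, length.out = 200)
plot(w, arma_spec(w, phi = 0.6), type = "l", ylab = "f(w)")  # AR(1) spectrum
```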
Week 14 (last week of class): The bias and variance of the periodogram, including the chi-squared behavior of the periodogram when computed on white noise. The asymptotic distribution of the periodogram for general linear processes (e.g. ARMA). Smoothing the periodogram and asymptotic theory for the smoothed periodogram estimator. Conditions on the smoothing weights that ensure that the periodogram converges to the spectral density. Bartlett's method for estimating the spectral density by dividing a data set into chunks and computing the periodogram on each chunk separately.
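As an illustration, spec.pgram() in base R computes the raw periodogram and, via the spans argument, a modified-Daniell-smoothed version (arbitrary simulated example):

```r
set.seed(479)
x <- arima.sim(n = 512, model = list(ar = 0.6))

op <- par(mfrow = c(1, 2))
spec.pgram(x, taper = 0, main = "Raw periodogram")
spec.pgram(x, spans = c(7, 7), taper = 0, main = "Smoothed (modified Daniell)")
par(op)
```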
Course Description: Stationary series, spectral analysis, models in time series: autoregressive, moving average, ARMA and ARIMA. Smoothing series, computational techniques and computer packages for time series.
Prerequisites: STAT 372 and 378.
Grading:
Grade breakdown
5 assignments, together worth 35% of the total grade. The lowest assignment grade is dropped.
2 quizzes, each worth 15%.
The final exam is worth 35%.
Assignments: All assignments are to be submitted on Canvas. You may scan handwritten solutions or write up solutions in LaTeX (preferred). If you choose to write your solutions by hand, please make sure that they are legible. For coding questions, please submit the relevant code chunks and output as part of your solution, and also include your raw code in a separate file. Assignments are meant to be completed individually, without assistance from your peers or from generative AI models.
Late policy: 25% is deducted from an assignment's grade for each day it is late. Assignments are due at 11:59 PM MST on the day indicated in the syllabus.
Resources:
Textbooks:
There is no required textbook for this course. However, we will be loosely following
Time Series Analysis and Its Applications by Shumway and Stoffer (2009). This book can be downloaded here.
Two other books that may be useful are:
Introduction to Time Series and Forecasting by Brockwell and Davis (2016). This book can be downloaded here.
Time Series: Theory and Methods by Brockwell and Davis (1991). This book can be downloaded here.
The former is another introductory book on time series; the latter is more advanced, but is a classic.
Software: We will be using R throughout this course. Coding portions of assignments should be done in R.
Other resources: Professor Adam Kashlak taught a previous version of this course. His lecture recordings and course notes are bound to be helpful and can be found here and here respectively. The material covered in this version of the course will be similar, but not identical, to the material covered by Professor Kashlak.
Class Time, Office Hours, and Contact Information:
Class time: Tuesdays and Thursdays, 9:30-10:50 AM, GSB 8-59.
Office hours: Tuesdays and Thursdays, 10:50-11:50 AM, CAB 475.
My email is: mccorma2[AT]ualberta[DOT]ca
Tentative Outline:
Introduction to time series data.
Stationarity and the autocovariance function.
Estimation of means and autocovariances.
Smoothing, differencing, and trend estimation.
Introduction to AR, MA, and ARMA processes.
The AR and MA polynomials, causality, and invertibility.
Autocovariance functions for ARMA processes.
Forecasting and prediction in general and for ARMA processes.
The partial autocorrelation function.
Specialized algorithms for computing predictions.
Parameter estimation: Yule-Walker estimates and MLEs.
Model selection.
ARIMA models.
SARIMA models.
Basic time series models for discrete data.
A primer on the discrete Fourier transform and the periodogram.
The spectral density function.
Smoothed estimates of the spectral density.
ARCH and GARCH (time permitting).
Introduction to multivariate time series (time permitting).
Note: Many of the above topics will take more than one lecture to discuss. While I hope to cover most of the topics listed above, due to time constraints some topics may be omitted.
The second quiz will be held in class on March 13th. You are allowed one double-sided sheet of course notes for the quiz. You shouldn't need it, but you may bring a non-programmable calculator. The quiz consists of 6 long-answer questions.
Topics covered on the quiz include:
Causality, invertibility and the existence of stationary solutions for ARMA processes.
Methods for computing the ACVF of ARMA processes.
Best linear predictions and their associated prediction errors. Theoretical properties for these BLPs and how to compute them. Note that the Durbin-Levinson and innovations algorithms will not appear on the quiz.
The PACF. How to compute and how to interpret it.
How to interpret ACF and PACF plots.