DS-GA 1018: Probabilistic Time Series Analysis

Midterm I Practice Problems

Date: Wednesday, October 15th

These questions are not due; they are for your practice. The TAs will go over the solutions in the lab session before the midterm.


Problem 1 (10 points)

Consider the graphical model below:

X₁ → X₂ → X₃ → X₄ → X₅
(a chain with two additional edges: X₁ → X₃ and X₃ → X₅)

(i) (2 points)

Write down the factorization of p(X_{1:5}) implied by the graphical model.

(ii) (2 points)

What is the Markov boundary of X_4?

(iii) (2 points)

What is the Markov boundary of X_2?

(iv) (4 points)

Write down the factorization of p(X_{1:3},X_{5}|X_4).
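As a numerical sanity check on parts (ii) and (iii), you can instantiate the graph as a linear-Gaussian model and verify a Markov boundary by Gaussian conditioning: given the boundary of X₄, its conditional covariance with every other node should vanish. This is a sketch, not part of the problem; the edge weights below are arbitrary, and the graph is read as the chain X₁ → … → X₅ plus the edges X₁ → X₃ and X₃ → X₅.

```python
import numpy as np

# Hypothetical linear-Gaussian instantiation of the assumed graph.
# Row i of B holds the coefficients on the parents of X_{i+1}.
B = np.zeros((5, 5))
B[1, 0] = 0.8                  # X2 <- X1
B[2, 1], B[2, 0] = 0.6, 0.4    # X3 <- X2, X1
B[3, 2] = 0.7                  # X4 <- X3
B[4, 3], B[4, 2] = 0.5, 0.3    # X5 <- X4, X3

# Joint covariance of X = (I - B)^{-1} W with unit-variance noise W.
A = np.linalg.inv(np.eye(5) - B)
Sigma = A @ A.T

def cond_cov(Sigma, keep, given):
    """Covariance of X[keep] given X[given] (Gaussian conditioning)."""
    S11 = Sigma[np.ix_(keep, keep)]
    S12 = Sigma[np.ix_(keep, given)]
    S22 = Sigma[np.ix_(given, given)]
    return S11 - S12 @ np.linalg.solve(S22, S12.T)

# Conditioning X4 (index 3) on its Markov boundary {X3, X5} should make
# it independent of X1 and X2, i.e., both conditional covariances are ~0.
C = cond_cov(Sigma, keep=[0, 1, 3], given=[2, 4])
print(C[:2, 2])  # cov(X1, X4 | X3, X5) and cov(X2, X4 | X3, X5)
```

In a Gaussian model, conditional independence is equivalent to a zero entry in the conditional covariance, which is why this check works.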


Problem 2 (10 points)

Consider an AR(2) process whose AR polynomial factorizes as:

\phi(B) = (1 - 0.2B)(1 - 0.5B),

i.e., X_t = 0.7 X_{t-1} - 0.1 X_{t-2} + W_t.

Please answer the following questions:

(i) (2 points)

Is the process causal?

(ii) (8 points)

What is the autocorrelation function \rho(t,t+h)=\rho(h)? Hint: remember that \rho(0) = 1.
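A short numerical sketch (our own, not part of the problem) can confirm both parts: causality follows from the roots of \phi(z) lying outside the unit circle, and the ACF obtained from the Yule-Walker equations must match the closed form in the roots 0.2 and 0.5.

```python
import numpy as np

# phi(B) = (1 - 0.2B)(1 - 0.5B) = 1 - 0.7B + 0.1B^2, so the process is
# X_t = 0.7 X_{t-1} - 0.1 X_{t-2} + W_t.

# (i) Causality: both roots of phi(z) must lie outside the unit circle.
roots = np.roots([0.1, -0.7, 1.0])  # 0.1 z^2 - 0.7 z + 1, highest degree first
print(np.sort(np.abs(roots)))       # magnitudes 2 and 5: both > 1, so causal

# (ii) The ACF satisfies the same recursion as the process:
# rho(h) = 0.7 rho(h-1) - 0.1 rho(h-2), with rho(0) = 1 and, from the
# Yule-Walker equations, rho(1) = phi_1 / (1 - phi_2) = 0.7 / 1.1.
rho = [1.0, 0.7 / 1.1]
for h in range(2, 10):
    rho.append(0.7 * rho[-1] - 0.1 * rho[-2])

# Closed form: rho(h) = c1 * 0.2^h + c2 * 0.5^h, with c1, c2 fixed by
# rho(0) = 1 and rho(1) = 7/11; solving gives c1 = -5/11, c2 = 16/11.
c1, c2 = -5.0 / 11.0, 16.0 / 11.0
closed = [c1 * 0.2**h + c2 * 0.5**h for h in range(10)]
print(np.allclose(rho, closed))  # the two agree at every lag
```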


Problem 3 (10 points)

Consider a causal MA(1) process of the form:

X_t = W_t + \theta_1 W_{t-1},

with 0 < \theta_1 < 1 and W_t \sim \mathcal{N}(0, 1). Assume that the series extends backwards in time to t\to -\infty.

(i) (8 points)

Assume that we have observations \{x_1,x_2\}. Derive the mean and variance of a future observation x_t for t=3.

(ii) (2 points)

Assume that we have observations \{x_1,x_2\}. Derive the mean and variance of a future observation x_t for t>3. Explain the intuition behind this result.
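Both parts can be checked with plain Gaussian conditioning, since the process is jointly Gaussian with \gamma(0) = 1 + \theta_1^2, \gamma(1) = \theta_1, and \gamma(h) = 0 for h \ge 2. The sketch below uses an illustrative \theta_1 = 0.5 (our choice; any value in (0, 1) behaves the same way).

```python
import numpy as np

theta = 0.5                       # illustrative value of theta_1
g0, g1 = 1 + theta**2, theta      # gamma(0) and gamma(1); gamma(h>=2) = 0

# Covariance of the observed pair (x1, x2) and its cross-covariance with x3.
Sigma = np.array([[g0, g1],
                  [g1, g0]])
c3 = np.array([0.0, g1])          # cov(x3, x1) = 0, cov(x3, x2) = gamma(1)

# Gaussian conditioning:
#   E[x3 | x1, x2]   = c3' Sigma^{-1} (x1, x2)'
#   Var[x3 | x1, x2] = gamma(0) - c3' Sigma^{-1} c3
w = np.linalg.solve(Sigma, c3)
var3 = g0 - c3 @ w
print(w, var3)                    # predictive weights and variance at t = 3

# For t > 3 every cross-covariance with (x1, x2) is zero, so the
# observations carry no information: the forecast reverts to the
# unconditional distribution N(0, 1 + theta^2).
c_far = np.zeros(2)
var_far = g0 - c_far @ np.linalg.solve(Sigma, c_far)
print(var_far == g0)
```

Note that only x₂ receives nonzero weight at t = 3 through \gamma(1); x₁ still enters indirectly via Sigma^{-1}.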


Problem 4 (6 points)

Consider the Kalman filtering and Kalman smoothing equations of our LDS model.

(i) (3 points)

Why do we say that the Kalman gain matrix, \mathbf{K}_t, determines how important the current observation is when updating our beliefs about the latent state? Point to specific equations in your argument.

(ii) (3 points)

Why can we say that the filtering matrix, \mathbf{F}_t, determines how important future observations are when updating our beliefs about the latent state? Point to specific equations in your argument.
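For part (i), a scalar LDS makes the role of the gain easy to see. The sketch below is ours (the symbols A, C, Q, R may differ from your course notes): in the update \mu_t = A\mu_{t-1} + K_t(y_t - C A \mu_{t-1}), K_t multiplies the innovation, so driving the observation noise R up or down shows directly how much y_t moves the belief.

```python
import numpy as np

# Scalar LDS: z_t = A z_{t-1} + noise(Q), y_t = C z_t + noise(R).
def kalman_gain_sequence(A, C, Q, R, P0, T):
    """Return the Kalman gains K_1..K_T for a scalar LDS."""
    P, gains = P0, []
    for _ in range(T):
        P_pred = A * P * A + Q                  # predicted state variance
        K = P_pred * C / (C * P_pred * C + R)   # Kalman gain
        P = (1 - K * C) * P_pred                # filtered state variance
        gains.append(K)
    return gains

# Tiny observation noise: K_t -> 1, the filter trusts y_t almost completely.
K_precise = kalman_gain_sequence(A=0.9, C=1.0, Q=1.0, R=1e-6, P0=1.0, T=20)
# Huge observation noise: K_t -> 0, the observation is essentially ignored
# and the update reduces to the prediction A mu_{t-1}.
K_noisy = kalman_gain_sequence(A=0.9, C=1.0, Q=1.0, R=1e6, P0=1.0, T=20)
print(K_precise[-1], K_noisy[-1])
```

The same limiting argument, written against the matrix update equations, is the core of the answer to part (i).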