Applied Stochastic Processes

Term: Winter 2026

Updated: March 16, 2026

From the [course syllabus]:

Stochastic processes are indexed collections of random variables used to describe phenomena in which a dependence structure arises from evolution across time (or space). Markov processes, in particular, are stochastic processes in which dependence is local: given the current state (or values on a separating boundary), the future (outside the boundary) is conditionally independent of the past (or interior history). Markov processes have rich applications in epidemiology, finance, biology, social science, engineering, chemistry, and beyond, and they are also important in statistics. In particular, Markov chain Monte Carlo (MCMC) methods are central to modern Bayesian statistics as a means to approximate complex posterior distributions for Bayesian inference via simulation. This course is a graduate-level introduction to Markov processes and Markov chains, covering four key areas: discrete-time models, continuous-time models, MCMC, and, briefly, Brownian motion and Gaussian processes. Students can expect to learn core concepts and probabilistic language for describing Markov processes, gain exposure to common models and estimation methods, and explore applications.

Instructor: Trevor Ruiz (he/him) [email]

Class meetings: 10:10am–12:00pm MW in 10-124

Office hours: MW 1:00pm–2:30pm and [by appointment] in 25-236 or via Zoom; drop-ins are welcome, but appointments are recommended and appreciated.

Week 1 (1/5)

Monday: introduction to Markov chains

Wednesday: transition probabilities

Week 2 (1/12)

Monday: limiting and stationary distributions

  • [hw2] exercises 3.7, 3.8, 3.14a-b, 3.22, 3.10, 3.63 (copy of utilities.R), due Tuesday 1/20
  • [reading] 2.4, 3.1, 3.2
  • [R script] exploring limits
  • [lecture notes]
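The linked R script explores these limits by simulation; as a quick language-agnostic sketch (a hypothetical two-state chain, not course code), a stationary distribution π satisfying πP = π can also be found by power iteration:

```python
import numpy as np

# Hypothetical two-state chain (0 = dry, 1 = wet); illustrative, not course data.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def stationary(P, tol=1e-12, max_iter=10_000):
    """Stationary distribution (pi P = pi) by power iteration from uniform."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        new = pi @ P
        if np.abs(new - pi).max() < tol:
            return new
        pi = new
    return pi

pi = stationary(P)  # for this chain, pi = (5/6, 1/6)
```

Iterating πP converges here because the chain is finite, irreducible, and aperiodic.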

Wednesday: finding stationary distributions

Week 3 (1/20)

MLK Day observed; Tuesday follows Monday schedule

Tuesday: recurrence, transience, and periodicity

  • [hw3] exercises 3.23, 3.29, 3.52, 3.54, 3.66; optionally, 3.64
  • [reading] 3.3, 3.5
  • [lecture notes]
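For intuition on periodicity (hypothetical chains, not course material): the period of a state is the gcd of the lengths of all paths that return to it, which can be computed directly from the transition structure:

```python
from math import gcd

# Hypothetical chains as adjacency lists of positive-probability moves.
cycle4 = {0: [1], 1: [2], 2: [3], 3: [0]}   # deterministic 4-cycle: period 4
lazy = {0: [0, 1], 1: [0]}                  # self-loop at state 0: aperiodic

def period(adj, state, max_len=50):
    """Period of `state`: gcd of all n <= max_len with a length-n return path."""
    d = 0
    exactly_n = {state}  # states reachable in exactly n steps
    for n in range(1, max_len + 1):
        exactly_n = {j for i in exactly_n for j in adj[i]}
        if state in exactly_n:
            d = gcd(d, n)
    return d
```

For the 4-cycle, returns are possible only at steps 4, 8, 12, …, so the gcd is 4; the self-loop allows a length-1 return, forcing period 1.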

Wednesday: limit theorem for finite Markov chains

Week 4 (1/26)

Monday: likelihood estimation

  • [reading] Guttorp, P. (1995). Stochastic modeling of scientific data. Chapter 2, section 2.7. [pdf]
  • [reference] Anderson, T. W., & Goodman, L. A. (1957). Statistical inference about Markov chains. The Annals of Mathematical Statistics, 28(1), 89–110. [pdf]
  • [R script] weather in SLO
  • [lecture notes]
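The MLE of a transition matrix from a single observed chain is the row-normalized matrix of transition counts; a minimal sketch (the `seq` data below is made up, not the SLO weather data):

```python
import numpy as np

# Hypothetical observed state sequence (e.g., 0 = dry day, 1 = wet day).
seq = [0, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0]

def transition_mle(seq, n_states):
    """MLE of the transition matrix: transition counts, normalized by row."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(seq[:-1], seq[1:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

P_hat = transition_mle(seq, 2)
```

Each row of `P_hat` is the empirical conditional distribution of the next state given the current one, which is exactly what maximizing the Markov likelihood produces.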

Wednesday: Bayesian estimation

Week 5 (2/2)

Groundhog Day not observed (but see data on Phil’s predictions and some simple models)

Monday: mini-project 1

Wednesday: midterm

Week 6 (2/9)

Monday: Metropolis-Hastings algorithm

  • [hw4] 5.7, 5.8, 5.18, 5.19, and your in-class example 5.17 with an added part (c): identify the likelihood and prior that produce this posterior in a Bayesian framework
  • [reading] 5.1, 5.2
  • [R script] uncertainty quantification
  • [R script] Dirichlet-multinomial MCMC
  • [lecture notes]
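A bare-bones random-walk Metropolis sampler can be written in a dozen lines; here is an illustrative Python sketch (not course code) targeting a standard normal as a stand-in for an unnormalized posterior:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    """Unnormalized log-density of a standard normal (stand-in for a posterior)."""
    return -0.5 * x**2

def rw_metropolis(log_target, x0=0.0, n=20_000, step=1.0):
    """Random-walk Metropolis: propose x + step*Z, accept w.p. min(1, pi(x')/pi(x))."""
    x, draws = x0, np.empty(n)
    for t in range(n):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop  # accept; otherwise keep the current state
        draws[t] = x
    return draws

draws = rw_metropolis(log_target)
```

Only log-density differences appear in the acceptance step, which is why the normalizing constant of the target can be unknown, the key fact that makes the method useful for posterior simulation.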

Wednesday: random walk Metropolis; Gibbs sampler
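For intuition on the Gibbs sampler, a standard textbook example (illustrative, not course code): for a standard bivariate normal with correlation ρ, each full conditional is univariate normal, so the sampler alternates two easy draws:

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.8  # target correlation (illustrative choice)

def gibbs_bvn(n=20_000):
    """Gibbs sampler for a standard bivariate normal with correlation rho:
    each full conditional is N(rho * other coordinate, 1 - rho**2)."""
    x = y = 0.0
    out = np.empty((n, 2))
    s = np.sqrt(1.0 - rho**2)
    for t in range(n):
        x = rho * y + s * rng.normal()  # draw x | y
        y = rho * x + s * rng.normal()  # draw y | x
        out[t] = (x, y)
    return out

draws = gibbs_bvn()
```

Each update conditions on the most recent value of the other coordinate, so no accept/reject step is needed, unlike Metropolis-Hastings.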

Week 7 (2/17)

Presidents’ Day observed

Wednesday: Gibbs sampler

Week 8 (2/23)

Monday: homogeneous Poisson processes

  • [hw6] 6.12, 6.23, 6.35, 6.41, 6.42, 6.43
  • [reading] 6.1, 6.2, 6.5
  • [lecture notes]
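A homogeneous Poisson process is easy to simulate from its defining property that interarrival times are iid Exponential(λ); an illustrative sketch (not course code):

```python
import numpy as np

rng = np.random.default_rng(3)

def poisson_process(rate, T):
    """Arrival times on [0, T]: cumulative sums of Exponential(rate) interarrivals."""
    times, t = [], rng.exponential(1.0 / rate)
    while t < T:
        times.append(t)
        t += rng.exponential(1.0 / rate)
    return np.array(times)

arrivals = poisson_process(rate=2.0, T=10.0)  # count N(10) ~ Poisson(20)
counts = [len(poisson_process(2.0, 10.0)) for _ in range(500)]
```

Averaging the counts over many runs recovers the mean λT, consistent with N(T) ~ Poisson(λT).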

Wednesday: spatial and nonhomogeneous Poisson processes
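One standard way to simulate a nonhomogeneous Poisson process is Lewis–Shedler thinning; a sketch under an assumed bounded intensity λ(t) = 1 + sin(t) (illustrative, not from the course):

```python
import numpy as np

rng = np.random.default_rng(4)

def thinned_nhpp(intensity, lam_max, T):
    """Lewis-Shedler thinning: simulate a rate-lam_max homogeneous process and
    keep each arrival t independently with probability intensity(t) / lam_max."""
    t, kept = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t >= T:
            return np.array(kept)
        if rng.uniform() < intensity(t) / lam_max:
            kept.append(t)

# Illustrative intensity, bounded above by lam_max = 2:
arrivals = thinned_nhpp(lambda t: 1.0 + np.sin(t), lam_max=2.0, T=50.0)
```

The method requires a bound λ(t) ≤ λ_max on the whole window; arrivals are then retained more often where the intensity is high, producing the desired nonuniform rate.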

Week 9 (3/2)

Monday: estimation of nonhomogeneous intensity; point processes

Wednesday: mini-project 2

Week 10 (3/9)

Monday: introduction to Brownian motion (asynchronous; no class meeting)

  • [reading] 8.1-8.2
  • [notebook] Brownian motion as a limit of random walks
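The limit idea can be sketched in a few lines (illustrative Python, not the course notebook): rescale a ±1 random walk by √n, so that the endpoint Sₙ/√n approximates B(1) ~ N(0, 1) for large n:

```python
import numpy as np

rng = np.random.default_rng(5)

def scaled_walk(n):
    """Path S_0, ..., S_n of a +/-1 random walk, rescaled by sqrt(n) (Donsker scaling)."""
    steps = rng.choice([-1.0, 1.0], size=n)
    return np.concatenate([[0.0], np.cumsum(steps)]) / np.sqrt(n)

# The endpoint S_n / sqrt(n) is approximately N(0, 1) for large n:
endpoints = np.array([scaled_walk(1_000)[-1] for _ in range(2_000)])
```

By the CLT the endpoint converges in distribution to N(0, 1); Donsker's theorem strengthens this to convergence of the whole rescaled path to Brownian motion.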

Wednesday: Brownian motion and applications

Finals week (3/16)

Wednesday: Final exam 10:10am–1:00pm in 10-124.