AM 120 Tentative Course Outline
The fundamental topic of this course is random (or "stochastic")
motion. Such motion is modeled by what are called stochastic
processes. The stochastic processes studied in the course are in
some sense the simplest possible ones, namely, finite and countable
state Markov chains. We will study many aspects of Markov chains:
how they are used to model "real life" situations, their qualitative
properties, how to compute with Markov chains, and how to control and
regulate random processes. As two main examples of the use of
Markov chains, some time will be spent on their application to problems
of speech recognition and pricing of options. A significant
component of the course will be a computational project, which can be
on either of these topics or one of the student's own choosing.
A preliminary and rough breakdown of the course is as follows.
- Class organization and background
- Review of deterministic dynamic programming
- Discrete time Markov chains
- State space and transition probabilities
- The gambler's ruin problem
- n-step transition probabilities
- Chapman-Kolmogorov equations
- Classification of states
- The strong Markov property
- Stationary distributions, convergence to equilibrium, and coupling
- First passage times
- Absorbing states
- Computational methods
- Hidden Markov models
- Application to speech recognition
- Risk neutral probabilities in finance
- Pricing of options I
- Stochastic dynamic programming
- Pricing of options II
- Continuous time Markov chains
- The exponential distribution and the Poisson process
- Birth and death processes
- Queueing models
- Stationary distributions
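To give a flavor of the computational side of the course, the sketch below works through two of the topics above for a made-up two-state chain: n-step transition probabilities via the Chapman-Kolmogorov equations (matrix powers of the transition matrix P), and the stationary distribution (a left eigenvector of P for eigenvalue 1). The transition probabilities are invented purely for illustration.

```python
import numpy as np

# Hypothetical two-state chain (e.g. "sunny", "rainy"); the entries of
# the transition matrix P are made up for this example.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Chapman-Kolmogorov: the n-step transition probabilities form the
# n-th matrix power of P.
P5 = np.linalg.matrix_power(P, 5)
print("5-step transition probabilities:\n", P5)

# Stationary distribution pi: solves pi P = pi with entries summing
# to 1, i.e. a left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()
print("stationary distribution:", pi)  # [5/6, 1/6] for this P
```

Each row of any power of P still sums to 1, and for this chain the n-step probabilities converge to the stationary distribution as n grows, which is the "convergence to equilibrium" phenomenon listed above.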
Some key dates for the course are as follows.
- 27 January (Thursday) - First class
- 22 February (Tuesday) - No class
- 10 March (Thursday) - Tentative date for midterm
- 29 March and 1 April - Mid-semester break
- 12 May (Thursday) - Final exam at 9 a.m.