An outline for
Stochastic Processes: Theory and Applications
by
Joseph T. Chang
 Introduction. Here the major classes of stochastic processes are described in general
terms and illustrated with graphs and pictures, and some of the applications are
previewed. A major purpose is to build up motivation, communicating the interest
and importance of the subject.

What is a stochastic process? Preview of Markov chains, Markov random fields,
hidden Markov models, point processes, martingales, Brownian motion, diffusions.

Motivation: brief descriptions of some applications. Applications to be treated
include: simulated annealing, stochastic approximation, Gibbs sampler, topics
in modern genetics (e.g. genetic mapping, evolutionary trees), image processing,
speech recognition, neural models, card shuffling, and finance.

Markov chains. Markov chains illustrate many of the important ideas of stochastic
processes in an elementary setting. Since the technical requirements are minimal, a
relatively complete mathematical treatment is feasible. This classical subject is still
very much alive, with exciting developments in both theory and applications coming
at an accelerating pace in recent decades. This chapter includes discussions of some of
these modern developments, such as speed of convergence to stationarity, generating
random objects, and applications to simulation and statistics.
 What is a Markov chain?
How to simulate one, Markov property.
 "It's all just matrix theory."
 Basic limit theorem: convergence of distribution to stationarity.
 Example of importance: generating a random object using Markov chains,
Markov sampling, approximate counting.
 Proof of theorem. Along the way important concepts such as recurrence and
coupling are presented in detail.
 Strong Law of Large Numbers for Markov chains.
 Branching processes.

More on recurrence: electric networks.
 Time reversibility: birth and death chains, random walks on graphs,
Metropolis algorithms.
 Speed of convergence to stationarity, card shuffling.
 Simulated annealing.
 Markov sampling (Gibbs sampling, Markov chain Monte Carlo) and Bayesian
statistics. A (discrete) change point example.
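Several of the ideas above (simulating a chain, convergence of the empirical distribution to stationarity, the strong law for Markov chains) can be sketched in a few lines. The two-state transition matrix below is an assumed toy example, not one from the text:

```python
import random

# A hypothetical two-state chain (states 0 and 1) with transition matrix P.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# The stationary distribution solves pi P = pi; for a 2-state chain it is
# proportional to (p10, p01), where p01 = P[0][1] and p10 = P[1][0].
p01, p10 = P[0][1], P[1][0]
pi = (p10 / (p01 + p10), p01 / (p01 + p10))

# Simulate the chain and record the long-run fraction of time in state 0;
# by the strong law for Markov chains it converges to pi[0].
random.seed(0)
x, visits0, n = 0, 0, 200_000
for _ in range(n):
    x = 0 if random.random() < P[x][0] else 1
    visits0 += (x == 0)

print(pi[0], visits0 / n)  # the two numbers should be close
```
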
 Markov chains in general state spaces. In many applications, the restriction
to discrete spaces is unnatural and undesirable. This chapter focuses on extending
the basic limit theorem from discrete to much more general state spaces. Extending the coupling idea to the more general setting is the interesting challenge. The
mathematical level is a bit higher than in the previous chapter.
 Introduction, examples, and stationary distributions.
 Chains with an atom.

Harris chains.
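A minimal sketch of a chain on a continuous state space (the Gaussian autoregression, an assumed illustrative example, not one fixed by the outline): X_{n+1} = a X_n + Z_{n+1} with standard normal noise has stationary distribution N(0, 1/(1 - a^2)), which simulation can confirm:

```python
import random

random.seed(1)
a = 0.5            # autoregression coefficient; |a| < 1 for stability
n = 200_000
x, total, total_sq = 0.0, 0.0, 0.0
for _ in range(n):
    x = a * x + random.gauss(0.0, 1.0)   # one step of the chain
    total += x
    total_sq += x * x

mean = total / n
var = total_sq / n - mean * mean
print(mean, var)   # near 0 and 1 / (1 - a**2) = 4/3 respectively
```
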
 Poisson processes. Poisson processes are a simple class of models for the occurrence
of events in space and time. They also form a basic building block for many more
complicated models in a variety of fields. This material is basic and important.
 Definitions and the classical elementary properties, such as superposition and
relation to order statistics of uniform distribution.
 Generalizations: nonhomogeneous processes, higher dimensions.
 Applications to record values, simple queues, models of neurons, genetic models
of mutations and crossovers.
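The elementary properties above can be checked by building a Poisson process from its i.i.d. exponential interarrival times; the count in a window of length t is then Poisson with mean (rate × t), so its mean and variance agree (the rate and horizon below are arbitrary choices):

```python
import random

random.seed(2)
rate, horizon, trials = 3.0, 2.0, 50_000
counts = []
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)   # i.i.d. exponential interarrivals
        if t > horizon:
            break
        n += 1
    counts.append(n)

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(mean, var)   # both near rate * horizon = 6, as for a Poisson count
```
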
 Continuous-time Markov chains. This will be a relatively short chapter; given
the development of discrete-time Markov chains, continuous time tends to strike students at this level as a rather minor variation. Technically, continuous time makes
certain analyses simpler; this chapter will discuss some applications of this type.
 How continuous-time chains work: rate matrix, forward and backward equations.
 Applications to queueing networks and neural network models.
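The mechanics of a continuous-time chain — exponential holding times with rates read off the rate matrix, then a jump — can be sketched for an assumed two-state example with Q = [[-1, 1], [2, -2]], whose stationary distribution puts mass 2/3 on state 0:

```python
import random

random.seed(3)
# Hold an Exponential time with the current state's total exit rate, then
# jump; with only two states, every jump switches states.
rates = {0: 1.0, 1: 2.0}      # total exit rate from each state
time0, total, state = 0.0, 0.0, 0
while total < 100_000.0:
    hold = random.expovariate(rates[state])
    if state == 0:
        time0 += hold
    total += hold
    state = 1 - state

print(time0 / total)   # near the stationary value 2/3
```
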
 Markov random fields and hidden Markov models. A modern subject currently
in vigorous development, this generalization of the Markov chain concept has been
used successfully in an impressive variety of applications. This chapter gives a bit of
theory, but emphasizes the applications.
 Idea of Markov random fields and hidden Markov models.
 Hammersley-Clifford Theorem.
 Ising model and phase transitions.
 Simulation of Markov random fields by Markov sampling.
 Hidden Markov models: parameter estimation and statistical inference.
 Applications to speech recognition, image reconstruction, DNA sequence alignment, evolutionary trees, ion channels, expert systems.
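Statistical inference for hidden Markov models rests on the forward recursion, which computes the likelihood of an observation sequence in time linear in its length. A minimal sketch with made-up parameters:

```python
# Forward algorithm for a toy 2-state HMM (all parameters are invented):
# alpha_t(j) = P(observations up to time t, state_t = j).
init = [0.6, 0.4]                          # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]           # transition probabilities
emit = [[0.9, 0.1], [0.2, 0.8]]            # P(symbol | state), symbols 0/1
obs = [0, 1, 1, 0]

alpha = [init[j] * emit[j][obs[0]] for j in range(2)]
for o in obs[1:]:
    alpha = [sum(alpha[i] * trans[i][j] for i in range(2)) * emit[j][o]
             for j in range(2)]

likelihood = sum(alpha)   # P(entire observation sequence)
print(likelihood)
```

The same quantity could be obtained by summing over all 2^4 hidden state paths, which is exactly what the recursion avoids.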
 Renewal theory. This is a classical topic in pure and applied probability. This
material interacts little with the other chapters (other than Markov chains), so it
could be inserted in a variety of positions in a course (and the book). Renewal theory
is particularly emphasized in certain applied areas such as operations research, and it
can be a useful tool in the analysis of other processes. Students appreciate a simple
explanation of the renewal theorem in terms of a jumping rabbit and they find the
"inspection paradox" and related issues interesting.
 The basic issues: jumping rabbits and changing lightbulbs.
 Renewal theorem, renewal equations.
 Applications to overshoot, total lifetime, "inspection paradox," runs and patterns.
 Defective renewal equations and applications to reflected random walks and quick
detection of a change in distribution.
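The "inspection paradox" can be demonstrated directly: with Exponential(1) lifetimes, the lightbulb found burning at a fixed inspection time has mean total lifetime near 2, twice the mean of a typical bulb (the inspection time and trial count below are arbitrary):

```python
import random

random.seed(4)
# Lightbulbs with Exponential(1) lifetimes, mean 1.  Inspect at time t:
# the interval covering t is length-biased, so its mean is about 2.
t, trials, total = 50.0, 20_000, 0.0
for _ in range(trials):
    s = 0.0
    while True:
        life = random.expovariate(1.0)
        if s + life > t:          # this bulb is alive at time t
            total += life
            break
        s += life

print(total / trials)   # near 2, not 1
```
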
 Martingales. This pillar of modern probability theory is interesting in its own
right, as well as a useful tool in the analysis of other classes of stochastic processes.
The presentation of the theory makes use of some basic properties of conditional
expectations, without an elaborate measure-theoretic development.
 Definitions and examples.
 Stopping times and optional sampling theorems.
Intuition: "Conservation of fairness" idea in playing games.
 Martingale convergence theorems.
 Applications.
 Absorption and ruin probabilities for Markov chains and random walks.
 Branching processes.
 Finance: option pricing and martingale representation in discrete time.
 Stochastic approximation:
 Proof of convergence of the Robbins-Monro procedure.
 Explanation of the relationship to clustering algorithms and backpropagation
for neural networks.
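The Robbins-Monro procedure itself fits in a few lines. Here it finds the root of the assumed regression function M(x) = x - 2 from noisy evaluations, with step sizes a_n = 1/n satisfying the classical conditions (sum a_n diverges, sum a_n^2 converges):

```python
import random

random.seed(5)
# Robbins-Monro: x_{n+1} = x_n - a_n * Y_n, where Y_n is a noisy
# observation of M(x_n) and M(x) = x - 2 has its root at 2.
x = 10.0
for n in range(1, 100_001):
    y = (x - 2.0) + random.gauss(0.0, 1.0)   # noisy observation of M(x)
    x -= y / n

print(x)   # close to the root, 2
```
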
 Brownian motion. This process is of fundamental importance, and many students
will have heard of it already. A full study is mathematically demanding, and the
whole subject is typically intimidating to beginners. This chapter introduces the
mathematical fundamentals with plenty of conceptual and intuitive discussion.
 Explanation of the definition.
 Visualizing Brownian motion.
 Sampling a BM arbitrarily finely, normal random walks.
 Explanation of strange and scary pathologies: nondifferentiability and infinitely many zeroes.

Stopping times, strong Markov property (intuitive explanation) and reflection
principle.
 Conditional distributions.
 Construction of BM: connect-the-dots.
 Brownian bridge. Application to testing for uniformity.
 Two approaches to a boundary crossing problem: martingales and differential
equations.
 Introduction to weak convergence, Donsker's theorem, Brownian approximations
for random walks.
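The reflection principle gives P(max of W on [0,1] exceeds a) = 2 P(W_1 > a), which a discretized simulation can check approximately (the level, step count, and trial count are arbitrary; the discrete-time maximum is biased slightly low because the path can cross between grid points):

```python
import math
import random

random.seed(6)
a, steps, trials = 1.0, 200, 10_000
dt = 1.0 / steps
hits = 0
for _ in range(trials):
    w, peak = 0.0, 0.0
    for _ in range(steps):
        w += random.gauss(0.0, math.sqrt(dt))   # Brownian increment
        peak = max(peak, w)
    hits += (peak > a)

exact = math.erfc(a / math.sqrt(2.0))   # 2 * P(N(0,1) > a), about 0.317
print(hits / trials, exact)
```
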
 Diffusions, stochastic calculus, stochastic differential equations. Diffusions
generalize Brownian motion, allowing a much wider variety of phenomena to be modelled and studied. This material builds on the previous chapter.
Again, the mathematical derivations stop short of full rigor, but attempt to provide insight and intuition.
 Idea of diffusions and stochastic differential equations. Infinitesimal drift and
variance functions. Brownian motions as stochastic analog of linear functions,
diffusions as stochastic solutions of differential equations.
 The method of differential equations.
 Kolmogorov backward and forward equations, stationary distributions.
 Boundary behavior: reflection and absorption.

Ito integrals, Ito's formula.
 Applications: finance (option pricing and Black-Scholes formula), approximating
random walks and Markov chains by diffusions, population genetics, stochastic
control (e.g. optimal portfolio choice), Kalman filtering, testing for unit roots in
time series.
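As one concrete application, the Black-Scholes call price has the closed form S N(d1) - K e^{-rT} N(d2), with N the standard normal distribution function. A minimal implementation (the parameter values in the example call are arbitrary):

```python
import math

def black_scholes_call(s, k, r, sigma, t):
    """Black-Scholes price of a European call: spot s, strike k,
    risk-free rate r, volatility sigma, maturity t (in years)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    # N is the standard normal CDF, written in terms of erf.
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * N(d1) - k * math.exp(-r * t) * N(d2)

# At-the-money call, 5% rate, 20% volatility, one year: about 10.45.
print(black_scholes_call(100, 100, 0.05, 0.2, 1.0))
```
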
 Likelihood ratios and extremes. Boundary crossing problems for stochastic processes arise frequently in
applications. For example, many statistical procedures raise an alarm or reject a
hypothesis when a certain stochastic process reaches a certain height. This
chapter explores some interesting and powerful techniques for determining or
approximating probabilities of this type of event.
 Poisson clumping heuristic.
 Explanation of the idea and a worked-out example (the Ornstein-Uhlenbeck process).
 Applications: e.g. genetic mapping.
 Change of measure, likelihood ratios, Wald likelihood ratio identity.
 Derivation of inverse Gaussian distribution.
 Conditioning a random walk or Brownian motion having negative drift
to reach a high level.
Comment on "Neo-Darwinism implies punctuated equilibrium."
 Optimality of the sequential probability ratio test.
 Alternative derivation of Black-Scholes.
 Importance sampling: a simulation technique for rare events.
 Introduction to large deviations.
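Importance sampling by exponential tilting can be sketched for the rare event {Z > a} with Z standard normal: sample instead from the tilted law N(a, 1), and reweight each sample by the likelihood ratio phi(x)/phi(x - a) = exp(-a x + a^2/2) (the level a and sample size are arbitrary):

```python
import math
import random

random.seed(7)
a, trials, acc = 4.0, 50_000, 0.0
for _ in range(trials):
    x = random.gauss(a, 1.0)               # sample from the tilted law
    if x > a:
        acc += math.exp(-a * x + a * a / 2.0)   # likelihood ratio weight

estimate = acc / trials
exact = 0.5 * math.erfc(a / math.sqrt(2.0))     # P(Z > 4), about 3.2e-5
print(estimate, exact)
```

Naive simulation would need on the order of millions of samples to see this event at all; the tilted estimator is accurate with a few thousand.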
 Appendices.
 Topics from mathematics and probability. This is a presentation of background
material that may be reviewed as needed.
 Indicator random variables. A review of a technique that is used throughout
the book.
 Conditioning. Conditional probabilities and expectations are an important topic about which students typically have a shaky understanding.
 Monotone convergence, dominated convergence, and all that. A statement of some basic results from measure theory that are used in a few proofs.
These are viewed largely as technicalities, although some examples involving
stopping times and martingales show that they can occasionally become a
central issue.
 Infinity and limsupery. Here some basic notions from elementary analysis
and probability are reviewed. These include infimum and supremum, liminf
and limsup. I want the students to be able to understand a definition like
"τ_i = inf{n > 0 : X_n = i}," and to appreciate a distinction like "T is not
only finite with probability 1, but also has finite expectation."
 Simulation. How to generate random variables from some standard distributions such as exponential and normal.
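A sketch of the two standard recipes — inverse-transform sampling for the exponential and the Box-Muller transform for the normal (the function names are my own):

```python
import math
import random

random.seed(8)

def exponential(rate):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    -log(U) / rate has the Exponential(rate) distribution."""
    return -math.log(random.random()) / rate

def normal_pair():
    """Box-Muller: two independent uniforms yield two independent N(0,1)s."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

samples = [exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))   # near the mean 1/rate = 0.5
```
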
 Programs. A listing and description of computer programs used in the book.
(I have mostly been using Mathematica for computation.) These will also be
accessible electronically via the Internet.