Summary: Chapter 3
Discrete Time Markov Chains
In this chapter we introduce discrete time Markov chains. For these models both time
and space are discrete. We will begin by introducing the basic model, and provide
some examples. Next, we will construct a Markov chain using only independent
uniformly distributed random variables. Such a construction will demonstrate how
to simulate a discrete time Markov chain, which will also be helpful in the continuous
time setting of later chapters. Finally, we will develop some of the basic theory of
discrete time Markov chains.
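The construction mentioned above, driving the chain with independent uniformly distributed random variables, can be sketched in Python. This is an illustrative sketch only: the function `simulate_chain`, its row-stochastic transition-matrix format, and the inverse-CDF selection step are assumptions for the example, not the author's construction.

```python
import random

def simulate_chain(P, x0, n_steps, rng=random.random):
    """Simulate a discrete time Markov chain with transition matrix P
    (a list of rows, each summing to 1) started from state x0.

    Each step consumes one independent Uniform(0, 1) variable and maps
    it to the next state via the inverse-CDF (cumulative sum) trick.
    """
    path = [x0]
    x = x0
    for _ in range(n_steps):
        u = rng()                      # U ~ Uniform(0, 1), independent each step
        cum = 0.0
        for j, p in enumerate(P[x]):   # choose the smallest j with F(j) > u
            cum += p
            if u < cum:
                x = j
                break
        path.append(x)
    return path

# Example: a two-state chain that deterministically alternates states.
print(simulate_chain([[0.0, 1.0], [1.0, 0.0]], 0, 4))  # [0, 1, 0, 1, 0]
```

The same pattern, next state as a deterministic function of the current state and one fresh uniform, carries over to simulating continuous time chains in later chapters.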
3.1 The Basic Model
Let Xn, n = 0, 1, 2 . . . , be a discrete time stochastic process with a discrete state
space S. Recall that S is said to be discrete if it is either finite or countably infinite.
Without loss of generality, we will nearly always assume that S is either {1, . . . , N}
or {0, . . . , N - 1} in the finite case, and either {0, 1, . . . } or {1, 2, . . . } in the infinite
setting.
To understand the behavior of such a process, we would like to know the values
of
P{X0 = i0, X1 = i1, . . . , Xn = in}, (3.1)
for every n and every finite sequence of states i0, . . . , in ∈ S. Note that having such
finite dimensional distributions allows for the calculation of any path probability. For

  

Source: Anderson, David F. - Department of Mathematics, University of Wisconsin at Madison

 

Collections: Mathematics