Markov chains are a fundamental class of stochastic processes. They are widely used to solve problems in a large number of domains, such as operational research, computer science, communication networks and manufacturing systems. The success of Markov chains is mainly due to their simplicity of use, the large number of available theoretical results and the quality of algorithms developed for the numerical evaluation of many metrics of interest.

The author presents the theory of both discrete-time and continuous-time homogeneous Markov chains. He carefully examines the explosion phenomenon, the Kolmogorov equations, the convergence to equilibrium and the passage time distributions to a state and to a subset of states. These results are applied to birth-and-death processes. He then proposes a detailed study of the uniformization technique by means of Banach algebra. This technique is used for the transient analysis of several queueing systems.

Contents
1. Discrete-Time Markov Chains
2. Continuous-Time Markov Chains
3. Birth-and-Death Processes
4. Uniformization
5. Queues

About the Author
Bruno Sericola is a Senior Research Scientist at Inria Rennes - Bretagne Atlantique in France. His main research activity is in performance evaluation of computer and communication systems, dependability analysis of fault-tolerant systems and stochastic models.
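To illustrate the uniformization technique described above, here is a minimal sketch of how it yields the transient state probabilities of a finite continuous-time Markov chain: the chain with generator Q is converted into a discrete-time chain P = I + Q/L, for any rate L bounding the exit rates, and the transient distribution is a Poisson-weighted sum of the powers of P. The three-state birth-and-death generator, the rates, and the truncation rule below are illustrative assumptions, not taken from the book.

```python
import numpy as np

def uniformization(Q, pi0, t, tol=1e-12):
    """Transient distribution pi(t) = sum_n e^{-Lt} (Lt)^n / n! * pi0 P^n."""
    L = max(-np.diag(Q)) * 1.05        # uniformization rate, any L >= max |q_ii|
    P = np.eye(Q.shape[0]) + Q / L     # uniformized DTMC transition matrix
    weight = np.exp(-L * t)            # Poisson(Lt) weight for n = 0
    term = pi0.copy()                  # pi0 P^n, starting at n = 0
    result = weight * term
    n, acc = 0, weight
    while 1.0 - acc > tol:             # stop once accumulated Poisson mass ~ 1
        n += 1
        term = term @ P
        weight *= L * t / n
        acc += weight
        result += weight * term
    return result

# Illustrative example: 3-state birth-and-death chain, birth rate 1, death rate 2.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 2.0, -3.0,  1.0],
              [ 0.0,  2.0, -2.0]])
pi0 = np.array([1.0, 0.0, 0.0])
pi_t = uniformization(Q, pi0, t=5.0)
print(pi_t)                            # close to the stationary vector (4/7, 2/7, 1/7)
```

Every term in the sum is a probability vector scaled by a positive weight, so the computation is numerically stable, which is a key practical advantage of uniformization over directly exponentiating Q.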
Preface ix
Chapter 1 Discrete-Time Markov Chains 1 (88)
1.1 Definitions and properties 1 (4)
1.2 Strong Markov property 5 (3)
1.3 Recurrent and transient states 8 (4)
1.4 State classification 12 (2)
1.5 Visits to a state 14 (4)
1.6 State space decomposition 18 (4)
1.7 Irreducible and recurrent Markov chains 22 (8)
1.8 Aperiodic Markov chains 30 (4)
1.9 Convergence to equilibrium 34 (7)
1.10 Ergodic theorem 41 (12)
1.11 First passage times and number of visits 53 (15)
1.11.1 First passage time to a state 53 (5)
1.11.2 First passage time to a subset of states 58 (6)
1.11.3 Expected number of visits 64 (4)
1.12 Finite Markov chains 68 (2)
1.13 Absorbing Markov chains 70 (6)
1.14 Examples 76 (11)
1.14.1 Two-state chain 76 (2)
1.14.2 Gambler's ruin 78 (4)
1.14.3 Success runs 82 (5)
1.15 Bibliographical notes 87 (2)
Chapter 2 Continuous-Time Markov Chains 89 (102)
2.1 Definitions and properties 92 (1)
2.2 Transition functions and infinitesimal generator 93 (15)
2.3 Kolmogorov's backward equation 108 (6)
2.4 Kolmogorov's forward equation 114 (13)
2.5 Existence and uniqueness of the solutions 127 (3)
2.6 Recurrent and transient states 130 (7)
2.7 State classification 137 (4)
2.8 Explosion 141 (7)
2.9 Irreducible and recurrent Markov chains 148 (14)
2.10 Convergence to equilibrium 162 (4)
2.11 Ergodic theorem 166 (6)
2.12 First passage times 172 (12)
2.12.1 First passage time to a state 172 (5)
2.12.2 First passage time to a subset of states 177 (7)
2.13 Absorbing Markov chains 184 (6)
2.14 Bibliographical notes 190 (1)
Chapter 3 Birth-and-Death Processes 191 (44)
3.1 Discrete-time birth-and-death processes 191 (9)
3.2 Absorbing discrete-time birth-and-death processes 200 (8)
3.2.1 Passage times and convergence to equilibrium 201 (3)
3.2.2 Expected number of visits 204 (4)
3.3 Periodic discrete-time birth-and-death processes 208 (1)
3.4 Continuous-time pure birth processes 209 (4)
3.5 Continuous-time birth-and-death processes 213 (15)
3.5.1 Explosion 215 (2)
3.5.2 Positive recurrence 217 (3)
3.5.3 First passage time 220 (5)
3.5.4 Explosive chain having an invariant probability 225 (1)
3.5.5 Explosive chain without invariant probability 226 (1)
3.5.6 Positive or null recurrent embedded chain 227 (1)
3.6 Absorbing continuous-time birth-and-death processes 228 (5)
3.6.1 Passage times and convergence to equilibrium 229 (2)
3.6.2 Explosion 231 (2)
3.7 Bibliographical notes 233 (2)
Chapter 4 Uniformization 235 (52)
4.1 Introduction 235 (2)
4.2 Banach spaces and algebra 237 (6)
4.3 Infinite matrices and vectors 243 (6)
4.4 Poisson process 249 (14)
4.4.1 Order statistics 252 (3)
4.4.2 Weighted Poisson distribution computation 255 (3)
4.4.3 Truncation threshold computation 258 (5)
4.5 Uniformizable Markov chains 263 (10)
4.6 First passage time to a subset of states 273 (2)
4.7 Finite Markov chains 275 (1)
4.8 Transient regime 276 (10)
4.8.1 State probabilities computation 276 (4)
4.8.2 First passage time distribution computation 280 (2)
4.8.3 Application to birth-and-death processes 282 (4)
4.9 Bibliographical notes 286 (1)
Chapter 5 Queues 287 (94)
5.1 The M/M/1 queue 288 (27)
5.1.1 State probabilities 290 (21)
5.1.2 Busy period distribution 311 (4)
5.2 The M/M/c queue 315 (3)
5.3 The M/M/∞ queue 318 (5)
5.4 Phase-type distributions 323 (3)
5.5 Markovian arrival processes 326 (16)
5.5.1 Definition and transient regime 326 (10)
5.5.2 Joint distribution of the interarrival times 336 (5)
5.5.3 Phase-type renewal processes 341 (1)
5.5.4 Markov modulated Poisson processes 342 (1)
5.6 Batch Markovian arrival process 342 (10)
5.6.1 Definition and transient regime 342 (7)
5.6.2 Joint distribution of the interarrival times 349 (3)
5.7 Block-structured Markov chains 352 (18)
5.7.1 Transient regime of SFL chains 354 (9)
5.7.2 Transient regime of SFR chains 363 (7)
5.8 Applications 370 (10)
5.8.1 The M/PH/1 queue 370 (2)
5.8.2 The PH/M/1 queue 372 (1)
5.8.3 The PH/PH/1 queue 372 (1)
5.8.4 The PH/PH/c queue 373 (3)
5.8.5 The BMAP/PH/1 queue 376 (1)
5.8.6 The BMAP/PH/c queue 377 (3)
5.9 Bibliographical notes 380 (1)
Appendix 1 Basic Results 381 (6)
Bibliography 387 (8)
Index 395