Provides an introduction to the basic structures of probability, with a view towards applications in information technology.
A First Course in Probability and Markov Chains presents an introduction to the basic elements of probability and focuses on two main areas. The first part explores the notions and structures of probability, including combinatorics, probability measures, probability distributions, conditional probability, inclusion-exclusion formulas, random variables, dispersion indexes, independent random variables, the weak and strong laws of large numbers, and the central limit theorem. The second part focuses on discrete-time Markov chains with discrete states, together with an introduction to Poisson processes and continuous-time Markov chains with discrete states. Throughout, the book uses measure-theoretic notation to unify the presentation, in particular avoiding a separate treatment of continuous and discrete distributions.
A First Course in Probability and Markov Chains:
- Presents the basic elements of probability.
- Explores elementary probability with combinatorics, uniform probability, the inclusion-exclusion principle, independence and convergence of random variables.
- Features applications of the law of large numbers.
- Introduces Bernoulli and Poisson processes as well as discrete and continuous time Markov chains with discrete states.
- Includes illustrations and examples throughout, along with solutions to the problems featured in the book.
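To give a flavor of the kind of material listed above, the law of large numbers says that the sample mean of i.i.d. random variables converges to their expected value. A minimal simulation sketch (the function name, parameters, and seed are our own illustration, not taken from the book):

```python
import random

def sample_mean(n, p=0.5, seed=42):
    """Average of n Bernoulli(p) draws; by the law of large
    numbers this approaches p as n grows."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

# The sample mean drifts toward the true mean p = 0.5 as n increases.
for n in (10, 1000, 100000):
    print(n, sample_mean(n))
```

Running the loop shows the deviation from p shrinking as n grows, which is the weak law of large numbers in action.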
The authors present a unified and comprehensive overview of probability and Markov chains, aimed at engineers working with probability and statistics, as well as advanced undergraduate students in science and engineering who have a basic background in mathematical analysis and linear algebra.
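As a flavor of the Markov-chain part (regular stochastic matrices and the ergodic property), the powers of a regular stochastic matrix converge to a matrix with identical rows, each row being the stationary distribution of the chain. A small sketch, using a hypothetical two-state chain of our own choosing:

```python
def mat_mul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# A regular (all entries positive) stochastic matrix for a two-state chain.
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Powers of a regular stochastic matrix converge to a matrix whose
# identical rows give the stationary distribution (ergodic property).
Pn = P
for _ in range(50):
    Pn = mat_mul(Pn, P)
print(Pn)  # both rows approach the stationary distribution
```

For this particular P, solving πP = π gives the stationary distribution π = (5/6, 1/6), and both rows of the iterated product approach it.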
Table of Contents
1 Combinatorics
- 1.1 Binomial coefficients
- 1.2 Sets, permutations and functions
- 1.3 Drawings
- 1.4 Grouping
2 Probability measures
- 2.1 Elementary probability
- 2.2 Basic facts
- 2.3 Conditional probability
- 2.4 Inclusion–exclusion principle
3 Random variables
- 3.1 Random variables
- 3.2 A few discrete distributions
- 3.3 Some absolutely continuous distributions
4 Vector valued random variables
- 4.1 Joint distribution
- 4.2 Covariance
- 4.3 Independent random variables
- 4.4 Sequences of independent random variables
5 Discrete time Markov chains
- 5.1 Stochastic matrices
- 5.2 Markov chains
- 5.3 Some characteristic parameters
- 5.4 Finite stochastic matrices
- 5.5 Regular stochastic matrices
- 5.6 Ergodic property
- 5.7 Renewal theorem
6 An introduction to continuous time Markov chains
Appendix A Power series
Appendix B Measure and integration
- B.1 Measures
- B.2 Measurable functions and integration
- B.3 Product measures and iterated integrals
- B.4 Convergence theorems
Appendix C Systems of linear ordinary differential equations
- Title: A First Course in Probability and Markov Chains
- Release date: January 2013
- Publisher(s): Wiley
- ISBN: 9781119944874