6 Markov Renewal Processes

6.1 Introduction

A Markov renewal process is a stochastic process that combines Markov chains and renewal processes. It can be described as a vector-valued process from which other processes, such as the Markov chain, the semi-Markov process (SMP), the Poisson process, and the renewal process, can be derived as special cases. Before we discuss the Markov renewal process, we first provide a basic introduction to renewal processes and regenerative processes.

6.2 Renewal Processes

Consider an experiment that involves a set of identical light bulbs whose lifetimes are independent. The experiment consists of using one light bulb at a time, and when it fails it is immediately replaced by another light bulb from the set. Each ...
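The light-bulb replacement scheme above can be simulated directly: draw i.i.d. lifetimes, replace on failure, and count replacements up to time t. The sketch below is illustrative, not from the book; the function name `renewal_count`, the exponential lifetime distribution, and the mean lifetime of 2.0 are all assumptions chosen for the example. By the elementary renewal theorem, N(t)/t should approach 1/E[X] for large t.

```python
import random

def renewal_count(t, lifetime_sampler, rng):
    """Number of renewals (bulb replacements) in [0, t].

    lifetime_sampler: callable taking an rng and returning one
    i.i.d. lifetime draw (hypothetical interface for this sketch).
    """
    n, elapsed = 0, 0.0
    while True:
        elapsed += lifetime_sampler(rng)
        if elapsed > t:
            return n
        n += 1

# Assumed example: exponential lifetimes with mean E[X] = 2.0
mean_lifetime = 2.0
sampler = lambda rng: rng.expovariate(1.0 / mean_lifetime)

t = 10_000.0
n = renewal_count(t, sampler, random.Random(42))
# Elementary renewal theorem: N(t)/t -> 1/E[X] = 0.5 as t grows
print(n / t)
```

Swapping `sampler` for any other nonnegative lifetime distribution (uniform, gamma, deterministic) leaves the long-run rate at 1/E[X], which is the defining feature of a renewal process.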
