<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is a Markov Chain?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A Markov chain is a mathematical system that transitions from one state to another according to certain probabilistic rules. Its defining characteristic is that no matter how the process arrived at its present state, the probability of transitioning to any particular state depends solely on the current state and the time elapsed."
    }
  }, {
    "@type": "Question",
    "name": "What are popular applications of Markov chains?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Popular applications include physics, chemistry, biology, testing, speech recognition, information theory, queueing theory, internet applications, statistics, economics and finance, the social sciences, games, music, baseball, and Markov text generators."
    }
  }]
}
</script>

# Markov chain

October 14, 2020

## What is a Markov Chain?

A Markov chain is a mathematical system that transitions from one state to another according to certain probabilistic rules. Its defining characteristic is that no matter how the process arrived at its present state, the possible future states are fixed: the probability of transitioning to any particular state depends solely on the current state and the time elapsed. The state space, or set of all possible states, can be anything: letters, numbers, weather conditions, baseball scores, or stock performances.

Markov chains may be modeled by finite state machines, and random walks provide a prolific example of their usefulness in mathematics. They arise broadly in statistical and information-theoretical contexts and are widely employed in economics, game theory, queueing (communication) theory, genetics, and finance. While it is possible to discuss Markov chains with any size of state space, the initial theory and most applications are focused on cases with a finite (or countably infinite) number of states.
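A random walk on a small state space makes this concrete. The sketch below (in Python, with invented transition probabilities) simulates a two-state weather chain: each step samples the next state from the current state alone, which is exactly the Markov property.

```python
import random

# A minimal sketch: a two-state weather chain with illustrative
# transition probabilities (the numbers are made up).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state (the Markov property)."""
    probs = transitions[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

state = "sunny"
walk = [state]
for _ in range(10):
    state = step(state)
    walk.append(state)
print(walk)
```

Nothing about the history of the walk enters `step`; the entire past is summarized by the current state.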

## What are the types of Markov chains?

Markov chains are classified along two axes: the nature of the state space and the nature of the time index.

For countable state space:

• Discrete-time: Markov chain on a countable or finite state space
• Continuous-time: Markov process or Markov jump process

For continuous or general state space:

• Discrete-time: Markov chain on a measurable state space (for example, Harris chain)
• Continuous-time: Any continuous stochastic process with the Markov property (for example, the Wiener process)
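As a sketch of the continuous-time case on a countable state space, the following Python snippet simulates a Markov jump process: the chain holds in each state for an exponentially distributed time, then jumps according to an embedded discrete-time chain. The rates and jump probabilities are invented for illustration.

```python
import random

# Exit rate from each of the three states {0, 1, 2} (invented values).
rates = [1.0, 2.0, 0.5]
# Embedded jump chain: probabilities of where to go on leaving a state
# (diagonal is 0 because a jump must change the state).
P = [[0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5],
     [0.9, 0.1, 0.0]]

def simulate(state, horizon):
    """Return the (time, state) path of the jump process up to `horizon`."""
    t, path = 0.0, [(0.0, state)]
    while True:
        # Holding time in the current state is Exponential(rates[state]).
        t += random.expovariate(rates[state])
        if t > horizon:
            return path
        state = random.choices(range(3), weights=P[state])[0]
        path.append((t, state))

path = simulate(0, horizon=5.0)
print(path[:3])
```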

## What are the properties of a Markov chain?

Several properties, of either an individual state or of the chain as a whole, help characterize a Markov chain's behavior. Let P be the transition matrix of the Markov chain {X0, X1, … }.

• A state i has period k ≥ 1 if any chain starting at and returning to state i with positive probability must take a number of steps divisible by k. If k = 1, then the state is known as aperiodic, and if k > 1, the state is known as periodic. If all states are aperiodic, then the Markov chain is known as aperiodic.
• A Markov chain is known as irreducible if there exists a chain of steps between any two states that has positive probability.
• An absorbing state i is a state for which P(i, i) = 1. Absorbing states are crucial for the discussion of absorbing Markov chains.
• A state is known as recurrent or transient depending upon whether or not the Markov chain will eventually return to it. A recurrent state is known as positive recurrent if its expected return time is finite, and null recurrent otherwise.
• A state is known as ergodic if it is positive recurrent and aperiodic. A Markov chain is ergodic if all its states are.
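These definitions can be checked mechanically from a transition matrix. The Python sketch below (using NumPy, on a hypothetical three-state cyclic chain) tests irreducibility, computes a state's period, and lists absorbing states.

```python
import numpy as np
from math import gcd
from functools import reduce

# A 3-cycle 0 -> 1 -> 2 -> 0: irreducible, every state has period 3.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
n = len(P)

# Irreducible: every state reaches every other with positive probability.
# Summing P^1 .. P^n leaves a positive entry exactly where some path exists.
reach = sum(np.linalg.matrix_power(P, k) for k in range(1, n + 1))
irreducible = bool((reach > 0).all())

def period(i, cutoff=50):
    """gcd of all step counts k with (P^k)[i, i] > 0 (k up to a cutoff
    suffices for a small chain like this one)."""
    returns = [k for k in range(1, cutoff)
               if np.linalg.matrix_power(P, k)[i, i] > 0]
    return reduce(gcd, returns) if returns else 0

# Absorbing state: P(i, i) = 1.
absorbing = [i for i in range(n) if P[i, i] == 1.0]

print(irreducible, period(0), absorbing)  # True 3 []
```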

Irreducibility and periodicity both concern the locations a Markov chain could occupy at some later point in time, given where it started. Stationary distributions deal with the likelihood of a process being in a certain state at an unknown point of time. For a Markov chain with a finite number of states, irreducibility implies that every state is positive recurrent; if such a chain is also aperiodic, it is ergodic and has a unique stationary distribution.
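One way to compute a stationary distribution, sketched below in Python with an invented two-state matrix, is to take the left eigenvector of P associated with eigenvalue 1 and normalize it to sum to 1, since the stationary distribution π satisfies πP = π.

```python
import numpy as np

# An invented two-state transition matrix for illustration.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# Left eigenvectors of P are eigenvectors of P transposed; pick the one
# whose eigenvalue is (numerically) 1 and normalize it to a distribution.
vals, vecs = np.linalg.eig(P.T)
v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = v / v.sum()
print(pi)  # ≈ [2/3, 1/3] for this P
```

For this matrix, solving πP = π by hand gives π = (2/3, 1/3), which matches the eigenvector computation.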

## What are popular applications of Markov chains?

Markov chains have many applications as statistical models of real-world processes, such as cruise control systems in motor vehicles, queues of customers arriving at an airport, currency exchange rates, and animal population dynamics.

Markov chains are also important in many fields, such as:

• Physics
• Chemistry
• Biology
• Testing
• Speech recognition
• Information theory
• Queueing theory
• Internet applications
• Statistics
• Economics and finance
• Social sciences
• Games
• Music
• Baseball
• Markov text generators

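The last item, Markov text generators, is easy to sketch: treat each word as a state and each observed word pair as a transition. The Python below trains on a made-up sentence chosen purely for illustration.

```python
import random
from collections import defaultdict

# A tiny made-up training text; real generators train on large corpora.
text = "the cat sat on the mat and the cat ran"
words = text.split()

# chain[w] lists every word observed immediately after w.
chain = defaultdict(list)
for cur, nxt in zip(words, words[1:]):
    chain[cur].append(nxt)

def generate(start, length=8):
    """Walk the word chain, choosing each next word from the followers
    of the current word alone (the Markov property)."""
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: the last word was never followed
            break
        out.append(random.choice(followers))
    return " ".join(out)

print(generate("the"))
```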
