Markov Chains Fundamentals
10 Flashcards
Steady-State Distribution
A probability distribution over the states of a Markov chain that is preserved by the transition probabilities: once the chain's state distribution matches it, the probabilities no longer change over time, and the chain is said to be in equilibrium.
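In symbols (notation added here for illustration, with \pi the stationary row vector and P the transition matrix): \pi P = \pi, together with \sum_i \pi_i = 1.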
Chapman-Kolmogorov Equations
Equations that express the probability of transitioning between two states in n + m steps as a sum, over all intermediate states, of the corresponding n-step and m-step transition probabilities.
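As a formula, writing P^{(n)}_{ij} for the probability of going from state i to state j in exactly n steps (notation added for illustration): P^{(n+m)}_{ij} = \sum_k P^{(n)}_{ik} P^{(m)}_{kj}.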
Markov Chain
A Markov chain is a stochastic process that describes a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
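A minimal simulation sketch in Python (the two-state weather chain and its probabilities are illustrative, not part of the card): at each step the next state is drawn using only the current state, which is exactly the dependence the definition describes.

import random

# Hypothetical two-state chain; each row lists next-state probabilities
# that depend only on the current state.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    state, path = start, [start]
    for _ in range(steps):
        state = random.choices(list(transitions[state]),
                               weights=list(transitions[state].values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 10))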
Transition Probability Matrix
A matrix whose entry in row i and column j is the probability of moving from state i to state j in one step of the Markov chain; each row therefore sums to 1.
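A concrete 2-state example (values chosen for illustration): P = [[0.9, 0.1], [0.5, 0.5]]; the entry 0.1 is the probability of moving from state 1 to state 2 in one step, and each row sums to 1.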
Absorbing State
A state in a Markov chain that, once entered, cannot be left. The probability of transitioning to any other state from an absorbing state is zero.
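In matrix terms, if state a is absorbing, its row of the transition matrix is all zeros except for a 1 on the diagonal: P_{aa} = 1 and P_{aj} = 0 for every j \neq a.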
Ergodic Markov Chain
An ergodic Markov chain is one where every state can be reached from every other state, and every state is aperiodic.
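A standard consequence worth attaching to this card: for a finite ergodic chain, the n-step transition probabilities converge to a unique steady-state distribution, \lim_{n \to \infty} P^{(n)}_{ij} = \pi_j, regardless of the starting state i.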
Markov Property
The property of a stochastic process where the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.
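In symbols: P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).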
Periodicity of a State
In a Markov chain, the periodicity of a state is the greatest common divisor (GCD) of the lengths of all possible cycles that can be made from that state to itself.
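Worked example (illustrative): in a two-state chain that always alternates between states 1 and 2, every return to state 1 takes 2, 4, 6, ... steps, so its period is gcd(2, 4, 6, ...) = 2; a state with period 1 is called aperiodic.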
Irreducible Markov Chain
A Markov chain is called irreducible if it is possible to get to any state from any state in a finite number of steps.
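Equivalently, for every pair of states i and j there is some number of steps n \ge 1 with P^{(n)}_{ij} > 0.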
State Space
In Markov chains, the state space is the set of all possible states in which the process could be.