Markov Chains Fundamentals

10 flashcards

Steady-State Distribution

A probability distribution over the states of a Markov chain that is left unchanged by a step of the chain; once the chain reaches it, the distribution stays the same over time and the chain is said to be in equilibrium.
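
For illustration, a minimal sketch of computing a steady-state distribution numerically by power iteration; the two-state matrix P below is a made-up example:

```python
import numpy as np

# Hypothetical two-state transition matrix (each row sums to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Power iteration: apply P repeatedly until the distribution stops changing.
pi = np.array([1.0, 0.0])          # arbitrary starting distribution
for _ in range(1000):
    nxt = pi @ P
    if np.allclose(nxt, pi):
        break
    pi = nxt

print(pi)            # ~[0.833, 0.167]
print(pi @ P - pi)   # ~[0, 0]: one more step leaves pi unchanged
```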

Chapman-Kolmogorov Equations

Equations relating multi-step transition probabilities in a Markov chain: the probability of going from state i to state j in m + n steps is the sum, over all intermediate states k, of the probability of reaching k from i in m steps and then j from k in n steps.
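
In matrix form this says that the (m + n)-step transition matrix is the product of the m-step and n-step matrices. A quick numerical check, using a made-up three-state matrix:

```python
import numpy as np

# Hypothetical three-state transition matrix (each row sums to 1).
P = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

m, n = 2, 3
lhs = np.linalg.matrix_power(P, m + n)                            # (m+n)-step matrix
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n) # product of powers
print(np.allclose(lhs, rhs))   # True
```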

Markov Chain

A Markov chain is a stochastic process that describes a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
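
As a sketch, simulating a small chain; the weather-style labels and matrix are invented for illustration, and each step samples the next state using only the current state's row:

```python
import numpy as np

rng = np.random.default_rng(0)

states = ["sunny", "rainy"]               # hypothetical state labels
P = np.array([[0.8, 0.2],                 # hypothetical transition matrix
              [0.4, 0.6]])

state = 0                                 # start in "sunny"
path = [states[state]]
for _ in range(10):
    # The next state depends only on the current state's row of P.
    state = rng.choice(len(states), p=P[state])
    path.append(states[state])

print(" -> ".join(path))
```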

Transition Probability Matrix

A matrix whose entry in row i and column j is the probability of moving from state i to state j in one step of the chain. Each row is a probability distribution and therefore sums to 1.
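
A small sketch of validating a candidate transition matrix (the example matrices are made up): entries must be non-negative and every row must sum to 1.

```python
import numpy as np

def is_transition_matrix(P, tol=1e-9):
    """Check that P is square, non-negative, and row-stochastic."""
    P = np.asarray(P, dtype=float)
    return (P.ndim == 2
            and P.shape[0] == P.shape[1]
            and (P >= -tol).all()
            and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_transition_matrix([[0.7, 0.3],
                            [0.2, 0.8]]))    # True
print(is_transition_matrix([[0.7, 0.4],
                            [0.2, 0.8]]))    # False: first row sums to 1.1
```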

Absorbing State

A state in a Markov chain that, once entered, cannot be left. The probability of transitioning to any other state from an absorbing state is zero.
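
In the transition matrix, an absorbing state appears as a row with a 1 on the diagonal and 0 everywhere else. A short sketch of spotting such rows, with a made-up matrix:

```python
import numpy as np

# Hypothetical chain in which state 2 is absorbing: its row is [0, 0, 1].
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.5, 0.2],
              [0.0, 0.0, 1.0]])

absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
print(absorbing)   # [2]
```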

Ergodic Markov Chain

An ergodic Markov chain is one in which every state can be reached from every other state (i.e., the chain is irreducible) and every state is aperiodic.

Markov Property

The property of a stochastic process where the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it.
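
Written out in standard notation, with X_n denoting the state at time n:

```latex
P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
    = P(X_{n+1} = j \mid X_n = i)
```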

Periodicity of a State

In a Markov chain, the periodicity of a state is the greatest common divisor (GCD) of the lengths of all possible cycles from that state back to itself. A state with period 1 is called aperiodic.
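
A minimal sketch of computing a state's period over a finite horizon (the matrix and the horizon cutoff are assumptions for illustration): take the GCD of every step count n at which a return to the state has positive probability.

```python
import math
import numpy as np

def period(P, state, horizon=50):
    """gcd of all n <= horizon with positive n-step return probability."""
    g = 0
    Pn = np.eye(len(P))
    for n in range(1, horizon + 1):
        Pn = Pn @ P
        if Pn[state, state] > 0:
            g = math.gcd(g, n)   # gcd(0, n) == n, so the first hit sets g
    return g

# Hypothetical chain that alternates between its two states: period 2.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(period(P, 0))   # 2
```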

Irreducible Markov Chain

A Markov chain is called irreducible if it is possible to get to any state from any state in a finite number of steps.
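
A sketch of checking irreducibility for a finite chain with N states (the example matrices are made up): every state can reach every state exactly when the sum I + P + ... + P^(N-1) has all entries positive.

```python
import numpy as np

def is_irreducible(P):
    """True if every state can reach every state in at most N-1 steps."""
    N = len(P)
    reach = np.eye(N)   # running sum I + P + ... + P^(N-1)
    Pn = np.eye(N)
    for _ in range(N - 1):
        Pn = Pn @ P
        reach += Pn
    return bool((reach > 0).all())

cycle = np.array([[0.0, 1.0],    # two states that swap every step
                  [1.0, 0.0]])
trap  = np.array([[0.5, 0.5],    # state 1 is absorbing, so state 0 is
                  [0.0, 1.0]])   # unreachable from it
print(is_irreducible(cycle))   # True
print(is_irreducible(trap))    # False
```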

State Space

In Markov chains, the state space is the set of all possible states in which the process could be.
