
Measure-Theoretic Probability

19 flashcards

Lp-Spaces

The $L^p$-space is a vector space of equivalence classes of measurable functions for which the $p$-th power of the absolute value is integrable, defined by the norm $\|f\|_p = \left( \int |f|^p \, dP \right)^{1/p}$.
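
As a quick numerical illustration (the function $f(x) = \sqrt{x}$, the exponent $p = 2$, and the uniform measure on $[0,1]$ are assumed choices, not part of the definition), the sketch below approximates $\|f\|_p$ with a midpoint sum:

```python
import numpy as np

# Assumed setup: P is the uniform probability measure on [0, 1];
# the integral is approximated by a midpoint Riemann sum.
def lp_norm(f, p, n_grid=100_000):
    x = np.linspace(0.0, 1.0, n_grid, endpoint=False) + 0.5 / n_grid  # cell midpoints
    return np.mean(np.abs(f(x)) ** p) ** (1.0 / p)

# Example: f(x) = sqrt(x), p = 2, so ||f||_2 = (integral of x dx)^{1/2} = sqrt(1/2).
print(lp_norm(np.sqrt, 2))   # ~0.7071
print(0.5 ** 0.5)            # exact value for comparison
```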

Independence

Two events $A$ and $B$ in a probability space are independent if $P(A \cap B) = P(A)P(B)$. For random variables, independence means the joint distribution factors into the product of the marginals.
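
For a concrete check, the sketch below enumerates a toy sample space of two fair coin flips (an assumed example) and verifies $P(A \cap B) = P(A)P(B)$ for $A$ = "first flip is heads" and $B$ = "second flip is heads":

```python
from itertools import product
from fractions import Fraction

# Assumed toy probability space: two fair coin flips, each outcome has probability 1/4.
omega = set(product("HT", repeat=2))

def P(event):
    return Fraction(len(event & omega), len(omega))

A = {w for w in omega if w[0] == "H"}   # first flip is heads
B = {w for w in omega if w[1] == "H"}   # second flip is heads

print(P(A & B), P(A) * P(B))            # both equal 1/4, so A and B are independent
```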

Dominated Convergence Theorem

The Dominated Convergence Theorem states that if $\{f_n\}$ are measurable, $f_n \to f$ a.e., and there exists an integrable function $g$ such that $|f_n| \leq g$ a.e., then $\lim_{n \to \infty} \int f_n \, dP = \int f \, dP$.
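
A small numerical illustration, with the assumed choices $f_n(x) = x^n$ on $[0,1]$ under the uniform measure and dominating function $g \equiv 1$: here $f_n \to 0$ a.e. and $\int f_n \, dP = 1/(n+1) \to 0 = \int \lim f_n \, dP$.

```python
import numpy as np

# Assumed setup: uniform measure on [0,1]; f_n(x) = x**n -> 0 a.e., with |f_n| <= 1 (integrable dominator).
x = np.linspace(0.0, 1.0, 1_000_000, endpoint=False) + 5e-7   # cell midpoints
for n in (1, 10, 100, 1000):
    approx = np.mean(x ** n)            # approximates the integral of f_n dP
    print(n, approx, 1.0 / (n + 1))     # both columns tend to 0, the integral of the limit
```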

Probability Space

A probability space is a measure space $(\Omega, \mathcal{F}, P)$ where $\Omega$ is the sample space, $\mathcal{F}$ is a sigma-algebra of events, and $P$ is a probability measure with $P(\Omega) = 1$.

Law of Total Probability

If $\{B_n\}_{n=1}^{\infty}$ is a countable partition of the sample space $\Omega$, then for any event $A$ in $\Omega$, $P(A) = \sum_{n} P(A \mid B_n) P(B_n)$, given that each $P(B_n) > 0$.
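
A worked numeric example (the two-block partition and all numbers are assumed for illustration): with $P(B_1) = 0.3$, $P(B_2) = 0.7$, $P(A \mid B_1) = 0.5$, and $P(A \mid B_2) = 0.2$, the formula gives $P(A) = 0.5 \cdot 0.3 + 0.2 \cdot 0.7 = 0.29$.

```python
# Assumed two-block partition {B1, B2} of Omega with the probabilities below.
P_B = [0.3, 0.7]            # P(B_n); a partition, so these sum to 1
P_A_given_B = [0.5, 0.2]    # P(A | B_n)

P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
print(P_A)                  # 0.29
```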

Borel-Cantelli Lemma

If $\{A_n\}$ is a sequence of events in a probability space and the sum of their probabilities is finite, i.e., $\sum_{n=1}^{\infty} P(A_n) < \infty$, then the probability that infinitely many of them occur is 0.
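
A rough Monte Carlo sketch (the event probabilities $P(A_n) = 1/n^2$ and their independence are assumed choices): the series sums to $\pi^2/6 < \infty$, and in a simulated run only a small, finite number of the $A_n$ occur, consistent with the lemma.

```python
import math
import random

random.seed(0)
N = 100_000
# Assumed: independent events A_n with P(A_n) = 1/n**2, a summable series.
occurred = [n for n in range(1, N + 1) if random.random() < 1.0 / n**2]

print(sum(1.0 / n**2 for n in range(1, N + 1)), math.pi**2 / 6)  # partial sum vs. its limit
print(len(occurred), occurred[:10])                              # only a handful of events occur
```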

Conditional Probability

Given two events $A$ and $B$ with $P(B) > 0$, the conditional probability $P(A \mid B)$ is defined as $\frac{P(A \cap B)}{P(B)}$.
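
A concrete example with one fair die (an assumed toy space): for $A$ = "the roll is 2" and $B$ = "the roll is even", $P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{1/6}{1/2} = \frac{1}{3}$.

```python
from fractions import Fraction

omega = set(range(1, 7))                 # assumed toy space: one fair die

def P(event):
    return Fraction(len(event & omega), len(omega))

A = {2}                                  # the roll is 2
B = {2, 4, 6}                            # the roll is even
print(P(A & B) / P(B))                   # 1/3
```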

Variance

The variance of a random variable $X$ is a measure of the spread of its distribution, defined as $\mathrm{Var}(X) = E[(X - E[X])^2]$.
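
For a fair six-sided die (an assumed example), $E[X] = 7/2$ and $\mathrm{Var}(X) = E[(X - 7/2)^2] = 35/12 \approx 2.92$; the sketch below computes this directly from the definition.

```python
from fractions import Fraction

outcomes = range(1, 7)                                        # assumed: fair die, P({k}) = 1/6
EX = sum(Fraction(k, 6) for k in outcomes)                    # E[X] = 7/2
var = sum(Fraction(1, 6) * (k - EX) ** 2 for k in outcomes)   # E[(X - E[X])**2]
print(EX, var, float(var))                                    # 7/2, 35/12, ~2.9167
```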

Expected Value

The expected value of a random variable $X$, denoted $E[X]$, is the integral $\int_{\Omega} X \, dP$ in a probability space $(\Omega, \mathcal{F}, P)$, providing a measure of the 'center' of the distribution of $X$.
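
As an illustration of the integral form, take $\Omega = [0,1]$ with the uniform measure and $X(\omega) = \omega^2$ (both assumed choices); then $E[X] = \int_\Omega X \, dP = 1/3$, which the sketch approximates.

```python
import numpy as np

# Assumed setup: Omega = [0,1] with the uniform probability measure, X(omega) = omega**2.
omega = np.linspace(0.0, 1.0, 1_000_000, endpoint=False) + 5e-7   # cell midpoints
EX = np.mean(omega ** 2)                                          # approximates the integral of X dP
print(EX)                                                         # ~0.3333 = 1/3
```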

Sigma-Algebra

A sigma-algebra $\mathcal{F}$ over a set $\Omega$ is a collection of subsets of $\Omega$ that includes the empty set, is closed under complementation, and is closed under countable unions.
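
A tiny brute-force check on the smallest sigma-algebra containing a single set $A \subset \Omega$, namely $\{\emptyset, A, A^c, \Omega\}$ (the example set is assumed); on a finite collection, countable unions reduce to finite ones.

```python
Omega = frozenset({1, 2, 3, 4})            # assumed small sample space
A = frozenset({1, 2})
F = {frozenset(), A, Omega - A, Omega}     # sigma-algebra generated by A

print(frozenset() in F)                            # contains the empty set
print(all(Omega - S in F for S in F))              # closed under complementation
print(all(S | T in F for S in F for T in F))       # closed under (finite) unions
```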

Random Variable

A random variable is a measurable function from a probability space into a measurable space, typically $(\Omega, \mathcal{F}) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$, where $\mathcal{B}(\mathbb{R})$ is the Borel sigma-algebra on the real numbers.

Fatou's Lemma

Fatou's Lemma states that for any sequence of non-negative measurable functions $\{f_n\}$, $\int \liminf_{n \to \infty} f_n \, dP \leq \liminf_{n \to \infty} \int f_n \, dP$.
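
A standard example in which the inequality is strict (the alternating indicator functions and the uniform measure on $[0,1]$ are assumed choices): let $f_n$ be the indicator of $[0, 1/2]$ for even $n$ and of $(1/2, 1]$ for odd $n$; then $\liminf_n f_n = 0$ pointwise while every $\int f_n \, dP = 1/2$.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100_000, endpoint=False) + 5e-6    # midpoints of a grid on [0,1]

def f(n, x):
    # Assumed sequence: alternating indicators of the left and right halves of [0,1].
    return (x <= 0.5).astype(float) if n % 2 == 0 else (x > 0.5).astype(float)

integrals = [np.mean(f(n, x)) for n in range(20)]
liminf_f = np.minimum.reduce([f(n, x) for n in range(20)])    # pointwise liminf (here just the min)
print(np.mean(liminf_f), min(integrals))                      # ~0.0 <= 0.5: Fatou holds, strictly
```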

Fubini's Theorem

Fubini's Theorem allows the interchange of the order of integration on product spaces: if $f$ is a measurable function on $X \times Y$ that is integrable with respect to the product measure (i.e., $\int |f| \, d(P \times Q) < \infty$), then $\int_X \left( \int_Y f(x,y) \, dQ(y) \right) dP(x) = \int_{X \times Y} f(x,y) \, d(P \times Q)(x,y) = \int_Y \left( \int_X f(x,y) \, dP(x) \right) dQ(y)$.
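
A numerical check with assumed choices $X = Y = [0,1]$, $P = Q =$ the uniform measure, and $f(x,y) = x y^2$: both iterated integrals and the double integral come out to roughly $1/6$.

```python
import numpy as np

n = 2_000
x = np.linspace(0.0, 1.0, n, endpoint=False) + 0.5 / n    # cell midpoints on [0,1]
y = x.copy()
f = np.outer(x, y ** 2)                                   # assumed integrand f(x, y) = x * y**2

dQ_then_dP = np.mean(f.mean(axis=1))   # integrate over Y first, then X
dP_then_dQ = np.mean(f.mean(axis=0))   # integrate over X first, then Y
product_integral = f.mean()            # integral against the product measure P x Q
print(dQ_then_dP, dP_then_dQ, product_integral, 1 / 6)
```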

Convergence in Probability

A sequence of random variables $X_n$ converges in probability to a random variable $X$ if for every $\epsilon > 0$, $\lim_{n \to \infty} P(|X_n - X| > \epsilon) = 0$.
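
A Monte Carlo sketch of one standard instance, the weak law of large numbers (the coin-flip model is an assumed choice): with $X_n$ the mean of $n$ fair coin flips, the estimated $P(|X_n - 1/2| > \epsilon)$ shrinks toward 0 as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
eps, trials = 0.05, 50_000

for n in (10, 100, 1_000, 10_000):
    X_n = rng.binomial(n, 0.5, size=trials) / n       # sample mean of n fair coin flips
    print(n, np.mean(np.abs(X_n - 0.5) > eps))        # estimated P(|X_n - 1/2| > eps) -> 0
```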

Convergence in Distribution

A sequence of random variables $X_n$ converges in distribution to a random variable $X$ if for every continuity point $x$ of the distribution function $F_X$, $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$, where $F_{X_n}$ and $F_X$ are the respective cumulative distribution functions.
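
A sketch of one standard instance, the central limit theorem (the uniform summands and the grid of evaluation points are assumed choices): the empirical CDF of the standardized sample mean is compared with the standard normal CDF $\Phi$.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))    # standard normal CDF

n, trials = 500, 20_000
U = rng.random((trials, n))                               # assumed: i.i.d. Uniform(0,1) summands
X_n = (U.mean(axis=1) - 0.5) / math.sqrt(1.0 / (12 * n))  # standardized sample mean

for x in (-1.0, 0.0, 1.0):
    print(x, np.mean(X_n <= x), Phi(x))                   # empirical F_{X_n}(x) vs. F_X(x)
```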

Probability Measure

A probability measure $P$ is a function from a sigma-algebra $\mathcal{F}$ to the interval $[0,1]$ such that $P(\Omega) = 1$, $P$ is countably additive, and $P(\emptyset) = 0$.

Almost Sure Convergence

A sequence of random variables $X_n$ converges almost surely to a random variable $X$ if $P(\lim_{n \to \infty} X_n = X) = 1$. Equivalently, the set of $\omega$ for which $X_n(\omega)$ does not converge to $X(\omega)$ has probability zero.

Monotone Convergence Theorem

The Monotone Convergence Theorem states that if $\{f_n\}$ is a sequence of non-negative measurable functions increasing to $f$, then $\lim_{n \to \infty} \int f_n \, dP = \int f \, dP$.
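
A numerical illustration with assumed choices: on $[0,1]$ with the uniform measure, $f_n(x) = \min(n, 1/\sqrt{x})$ increases to $f(x) = 1/\sqrt{x}$, and $\int f_n \, dP = 2 - 1/n$ climbs to $\int f \, dP = 2$.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 1_000_000, endpoint=False) + 5e-7   # cell midpoints on [0,1]
f = 1.0 / np.sqrt(x)                                          # integrable, with integral 2

for n in (1, 10, 100, 1_000):
    f_n = np.minimum(f, n)                 # non-negative and increasing to f
    print(n, np.mean(f_n))                 # approximates 2 - 1/n, which climbs to 2
```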

Radon-Nikodym Theorem

The Radon-Nikodym Theorem states that if $P$ and $Q$ are two sigma-finite measures on a space $(\Omega, \mathcal{F})$ and $Q$ is absolutely continuous with respect to $P$, then there exists a measurable function $f$ such that $Q(A) = \int_A f \, dP$ for all $A \in \mathcal{F}$.
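
A finite-space sketch (the space and all numbers are assumed): on $\Omega = \{1,2,3\}$ with $\mathcal{F} = 2^\Omega$, take $P$ uniform and $Q$ as below; since $P$ gives every point positive mass, $Q \ll P$, and the density is simply $f(\omega) = Q(\{\omega\})/P(\{\omega\})$.

```python
from itertools import chain, combinations
from fractions import Fraction

Omega = [1, 2, 3]
P = {w: Fraction(1, 3) for w in Omega}                          # reference measure (uniform)
Q = {1: Fraction(1, 2), 2: Fraction(3, 10), 3: Fraction(1, 5)}  # Q << P on the assumed space

f = {w: Q[w] / P[w] for w in Omega}                             # Radon-Nikodym derivative dQ/dP

events = chain.from_iterable(combinations(Omega, r) for r in range(4))
print(all(sum(Q[w] for w in A) == sum(f[w] * P[w] for w in A) for A in events))  # True
```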
