
Information Theory Basics

25 Flashcards

Entropy

A measure of the unpredictability or randomness of a system, typically denoted H(X) for a discrete random variable X.
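For a discrete distribution, H(X) = -Σ p(x) log2 p(x); a minimal Python sketch of that computation:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable; a biased coin is less so.
print(entropy([0.5, 0.5]))  # 1.0 bit
print(entropy([0.9, 0.1]))  # ~0.469 bits
```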

Channel Capacity

The maximum rate at which information can be reliably transmitted over a communication channel, denoted C; for a band-limited Gaussian channel it is given by the Shannon-Hartley theorem.

Shannon's Noisy Channel Coding Theorem

A fundamental theorem stating that there exists a coding technique that allows the transmission of data over a noisy channel at rates up to the channel capacity with an arbitrarily small error probability.

Mutual Information

The amount of information that one random variable contains about another random variable, often denoted I(X;Y).
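Given a joint probability table, I(X;Y) can be computed from the joint and marginal distributions; a minimal Python sketch:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table joint[x][y]."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Perfectly correlated binary variables share one full bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# Independent variables share no information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```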

Redundancy

The portion of information that is not necessary to understand the content, often a result of repeating patterns or predictability.

Noise

The unwanted alterations in the signal during transmission, which may lead to errors in the received data.

Source Coding

The process of encoding information from a source in a way that removes redundancy, making it more compact without losing essential information.

Signal-to-Noise Ratio (SNR)

A measure used to compare the level of a desired signal to the level of background noise, often expressed in decibels (dB).
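In decibels the ratio is 10 · log10(S/N); a minimal Python sketch:

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10(S / N)."""
    return 10 * math.log10(signal_power / noise_power)

# A signal 100x stronger than the noise is 20 dB above it;
# doubling the power adds about 3 dB.
print(snr_db(100, 1))  # 20.0
print(snr_db(2, 1))    # ~3.01
```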

Hamming Distance

The number of positions at which the corresponding symbols are different in two strings of equal length.
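This is straightforward to compute by comparing the strings position by position; a minimal Python sketch:

```python
def hamming_distance(a, b):
    """Number of positions where equal-length sequences a and b differ."""
    if len(a) != len(b):
        raise ValueError("sequences must have equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("karolin", "kathrin"))  # 3
print(hamming_distance("10110", "10011"))      # 2
```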

Nyquist Rate

The minimum sampling rate required to reconstruct a continuous band-limited signal from its samples without loss of information, equal to 2B, where B is the highest frequency present in the signal.
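The rate is simply twice the highest frequency; a tiny sketch with an illustrative audio example:

```python
def nyquist_rate(max_freq_hz):
    """Minimum sampling rate (samples/s): twice the highest frequency."""
    return 2 * max_freq_hz

# Audio band-limited to 20 kHz needs at least 40,000 samples/s
# (CD audio's 44.1 kHz rate sits comfortably above this).
print(nyquist_rate(20_000))  # 40000
```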

Differential Entropy

The counterpart of entropy for continuous random variables, measuring their randomness or unpredictability.

Lossy Compression

A data compression method that represents content with inexact approximations; the original cannot be perfectly reconstructed, but the resulting files are smaller.

Error Detection and Correction

Techniques in digital communications to detect and correct errors in data transmission and storage.

Data Compression

The process of encoding information using fewer bits, which may involve lossless or lossy methods.

Shannon-Hartley Theorem

A formula for the maximum rate of information C that can be transmitted over a communications channel of a specified bandwidth in the presence of noise, given as

C = B log2(1 + S/N)

where B is the bandwidth, S is the signal power, and N is the noise power.
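As a quick numeric check, the formula can be evaluated directly; a minimal Python sketch (the 3 kHz / 30 dB figures are only illustrative):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-grade channel with a linear SNR of 1000 (~30 dB):
print(channel_capacity(3000, 1000))  # ~29902 bps
```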

Coding Gain

The improvement in signal-to-noise ratio due to the implementation of an error-correcting code, contributing to more reliable transmission.

Rate-Distortion Theory

A framework in information theory that deals with the trade-off between the data rate of a source and the distortion (loss of fidelity) of the reconstructed data.

Automatic Repeat Request (ARQ)

A protocol for error control in data transmission, where the receiver detects errors and requests retransmission of corrupted data.

Shannon's Source Coding Theorem

States that a source can be compressed to an average of its entropy H(X) bits per symbol without loss of information, defining the fundamental limit of lossless data compression.

Lossless Compression

A data compression algorithm that allows the original data to be perfectly reconstructed from the compressed data.
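The defining property, perfect reconstruction, is easy to demonstrate with Python's standard zlib module; a minimal sketch:

```python
import zlib

data = b"abab" * 1000  # highly redundant input compresses well
compressed = zlib.compress(data)
restored = zlib.decompress(compressed)

assert restored == data  # lossless: the original is recovered exactly
print(len(data), "->", len(compressed), "bytes")
```

Note how the redundancy in the input (a repeating pattern) is exactly what the compressor exploits.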

Information Rate

The speed at which information is transmitted over a channel, typically measured in bits per second (bps).

Block Code

A method of encoding data in fixed-size blocks to enable error detection and correction for reliable transmission.

Convolutional Code

An error-correcting code where the encoded output is a function of the current and previous input bits, leading to overlapping code words that are processed sequentially.

Forward Error Correction (FEC)

A system of error control for data transmission that uses error-correcting codes to detect and correct errors at the receiver without the need for retransmission.

Markov Chain

A stochastic model that describes a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
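A minimal Python sketch of the Markov property, using a hypothetical two-state weather model (the states and probabilities are purely illustrative):

```python
import random

# Transition probabilities depend only on the current state.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def simulate(start, steps, rng=None):
    """Walk the chain for `steps` transitions, returning the state path."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    state, path = start, [start]
    for _ in range(steps):
        states = list(transitions[state])
        weights = [transitions[state][s] for s in states]
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```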


© Hypatia.Tech. 2024 All rights reserved.