Information Theory Basics
Entropy
A measure of the unpredictability or randomness of a system, typically denoted H(X) for a discrete random variable X.
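For the entropy card above, a minimal sketch of evaluating H(X) = -Σ p(x)·log2 p(x) for a discrete distribution; the probabilities are illustrative, not taken from this set:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```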
Channel Capacity
The maximum rate at which information can be reliably transmitted over a communication channel, represented as C; for a band-limited channel with Gaussian noise it is given by the Shannon-Hartley theorem.
Shannon's Noisy Channel Coding Theorem
A fundamental theorem stating that there exists a coding technique that allows the transmission of data over a noisy channel at rates up to the channel capacity with an arbitrarily small error probability.
Mutual Information
The amount of information that one random variable contains about another random variable, often denoted I(X; Y).
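One standard identity is I(X; Y) = H(X) + H(Y) - H(X, Y); a small sketch computing it from an assumed joint probability table (the numbers are illustrative):

```python
import math

def H(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution p(x, y) for two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

# I(X; Y) = H(X) + H(Y) - H(X, Y)
mi = H(px.values()) + H(py.values()) - H(joint.values())
print(round(mi, 4))  # ~0.278 bits
```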
Redundancy
The portion of information that is not necessary to understand the content, often a result of repeating patterns or predictability.
Noise
The unwanted alterations in the signal during transmission, which may lead to errors in the received data.
Source Coding
The process of encoding information from a source in a way that removes redundancy, making it more compact without losing essential information.
Signal-to-Noise Ratio (SNR)
A measure used to compare the level of a desired signal to the level of background noise, often expressed in decibels (dB).
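For power quantities the ratio in decibels is SNR_dB = 10·log10(P_signal / P_noise); a quick sketch with illustrative values:

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels, for power quantities."""
    return 10 * math.log10(signal_power / noise_power)

print(snr_db(1.0, 0.001))  # 30.0 dB
```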
Hamming Distance
The number of positions at which the corresponding symbols are different in two strings of equal length.
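A short sketch of the count over two example bit strings:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must have equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("10111", "10010"))  # 2
```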
Nyquist Rate
The minimum sampling rate required to reconstruct a continuous band-limited signal from its samples without loss of information, equal to twice the highest frequency present in the signal: f_s = 2·f_max, where f_max is the highest frequency.
Differential Entropy
The continuous counterpart of entropy for continuous random variables, measuring the randomness or unpredictability.
Lossy Compression
A data compression method that represents the content with inexact approximations, so the original cannot be perfectly reconstructed, but which yields smaller file sizes.
Error Detection and Correction
Techniques in digital communications to detect and correct errors in data transmission and storage.
Data Compression
The process of encoding information using fewer bits, which may involve lossless or lossy methods.
Shannon-Hartley Theorem
A formula for the maximum rate of information C that can be transmitted over a communications channel of bandwidth B in the presence of Gaussian noise, given as C = B·log2(1 + S/N), where S/N is the signal-to-noise ratio.
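A worked example under assumed values, roughly a 3 kHz telephone channel with a 30 dB signal-to-noise ratio:

```python
import math

def capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30
snr_linear = 10 ** (snr_db / 10)      # 30 dB corresponds to a ratio of 1000
print(capacity(3000, snr_linear))     # ~29,902 bits per second
```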
Coding Gain
The improvement in signal-to-noise ratio due to the implementation of an error-correcting code, contributing to more reliable transmission.
Rate-Distortion Theory
A framework in information theory that deals with the trade-off between the data rate of a source and the distortion (or accuracy) of the reconstructed data.
Automatic Repeat Request (ARQ)
A protocol for error control in data transmission, where the receiver detects errors and requests retransmission of corrupted data.
Shannon's Source Coding Theorem
States that a source cannot, on average, be losslessly compressed below its entropy (in bits per symbol), but can be compressed arbitrarily close to it, defining the fundamental limit of lossless data compression.
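As an illustration of approaching that limit, a minimal Huffman coder (a standard lossless technique, used here only as an example) over an assumed symbol distribution; for this dyadic distribution the average code length exactly matches the entropy:

```python
import heapq
import math

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bit string) from a {symbol: probability} dict."""
    heap = [[p, i, {sym: ""}] for i, (sym, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, [p1 + p2, counter, merged])
        counter += 1
    return heap[0][2]

# Illustrative source distribution.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)

entropy = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)              # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(entropy, avg_len)  # 1.75 bits of entropy, 1.75 bits/symbol on average
```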
Lossless Compression
A data compression algorithm that allows the original data to be perfectly reconstructed from the compressed data.
Information Rate
The speed at which information is transmitted over a channel, typically measured in bits per second (bps).
Block Code
A method of encoding data in fixed-size blocks to enable error detection and correction for reliable transmission.
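A toy sketch using a (3,1) repetition code, one of the simplest block codes (chosen here only for illustration): each data bit is sent as a block of three copies, and a majority vote at the receiver corrects any single flipped bit per block:

```python
def encode(bits):
    """(3,1) repetition code: each bit becomes a block of three identical bits."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each 3-bit block corrects any single error per block."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
sent = encode(data)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                  # simulate a single-bit error in the second block
print(decode(sent) == data)  # True: the error is corrected
```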
Convolutional Code
An error-correcting code in which each encoded output depends on the current and several previous input bits, so code words overlap and are generated sequentially as a continuous stream rather than in independent blocks.
Forward Error Correction (FEC)
A system of error control for data transmission that uses error-correcting codes to detect and correct errors at the receiver without the need for retransmission.
Markov Chain
A stochastic model that describes a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
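A small simulation sketch with an assumed two-state transition table, where the next state is drawn using only the current state's row:

```python
import random

# Illustrative transition probabilities: P[current][next]
P = {"sunny": {"sunny": 0.8, "rainy": 0.2},
     "rainy": {"sunny": 0.4, "rainy": 0.6}}

def step(state):
    """Draw the next state using only the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt

random.seed(0)
state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(path)
```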