Coding Theory Basics
Information Rate
The rate at which information can be transmitted over a communication channel, measured in bits per unit time.
Channel Capacity
The tightest upper bound on the amount of information that can be reliably transmitted over a communications channel.
Maximum Likelihood Decoding
A decoding method that, given a received block of data, chooses the codeword most likely to have been transmitted.
Parity Bit
A bit that is added to a group of bits to ensure that the number of bits with the value one is even or odd.
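As an illustration, a minimal even-parity sketch in Python (the function names are ours):

```python
def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits_with_parity):
    """A single flipped bit makes the count of 1s odd, so it is detected."""
    return sum(bits_with_parity) % 2 == 0
```

Flipping any one bit of the protected word makes `parity_ok` return `False`; flipping two bits goes undetected, which is why a single parity bit only detects odd numbers of errors.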
Hamming Code
A family of linear error-correcting codes that can detect up to two-bit errors or correct one-bit errors without detection of uncorrected errors.
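A sketch of the classic Hamming(7,4) instance, which encodes 4 data bits into 7 and corrects any single-bit error; the helper names are ours:

```python
def hamming74_encode(d):
    """Encode 4 data bits as [p1, p2, d1, p3, d2, d3, d4] (parity at positions 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the error, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]
```

The three syndrome bits, read as a binary number, point directly at the erroneous position, which is what makes single-error correction cheap here.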
Binary Code
A coding system using the binary digits 0 and 1 to represent a letter, digit, or other character in a computer or other electronic device.
Error Detection
The process of detecting errors introduced during data transmission or storage, typically by checking redundant information such as parity bits or checksums. (Fixing the detected errors is the separate task of error correction.)
Cyclic Redundancy Check (CRC)
An error-detecting code commonly used in digital networks and storage devices to detect accidental changes to raw data.
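As an illustration, Python's standard library exposes the common CRC-32 via `zlib.crc32`:

```python
import zlib

message = b"coding theory"
checksum = zlib.crc32(message)  # 32-bit CRC the sender transmits alongside the data

# The receiver recomputes the CRC; a corrupted byte yields a mismatch.
corrupted = b"coding theorz"
assert zlib.crc32(corrupted) != checksum
```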
Block Code
A coding scheme in which a fixed-length block of k message bits is encoded into a fixed-length block of n bits, where n is greater than k.
Coding Gain
The measure of the improvement in signal-to-noise ratio of a communication system from the use of coding.
Hamming Distance
The number of positions at which the corresponding symbols are different in two strings of equal length.
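A one-line sketch of the definition in Python (the function name is ours):

```python
def hamming_distance(a, b):
    """Count the positions at which two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must have equal length")
    return sum(x != y for x, y in zip(a, b))
```

For example, `hamming_distance("karolin", "kathrin")` is 3: the strings differ in their third, fourth, and fifth symbols.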
Shannon's Theorem
A theorem in information theory which states that for any given degree of noise contamination of a communication channel, it is possible to transmit information nearly error-free up to a computable maximum rate through the channel.
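For the special case of a bandlimited channel with additive white Gaussian noise, that maximum rate is given by the Shannon–Hartley formula, sketched here (the function name is ours):

```python
import math

def shannon_capacity(bandwidth_hz, snr):
    """Shannon-Hartley capacity of an AWGN channel: C = B * log2(1 + S/N).

    bandwidth_hz: channel bandwidth B in hertz
    snr: linear signal-to-noise ratio S/N (not decibels)
    """
    return bandwidth_hz * math.log2(1 + snr)

# A 3 kHz telephone line at 30 dB SNR (S/N = 1000) supports
# roughly 29.9 kbit/s of arbitrarily reliable transmission.
```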
© Hypatia.Tech. 2024 All rights reserved.