Information rate

If the source emits symbols at a fixed rate of rs symbols/sec, the average source information rate R is defined as

R = rs * H bits/sec

where H is the average information (entropy) per symbol in bits.

 

Example:

An analog signal is band-limited to B Hz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3, and Q4 (messages) are assumed independent and occur with probabilities P1 = P4 = 1/8 and P2 = P3 = 3/8. Find the information rate of the source.

By definition, the average information H is given by

H = -Σ Pi log2 Pi = 2 * (1/8) log2 8 + 2 * (3/8) log2 (8/3) ≈ 1.8 bits/message

Since the signal is band-limited to B Hz and sampled at the Nyquist rate, the source emits rs = 2B messages/sec. The information rate of the source is therefore

R = rs * H = 2B * (1.8) = 3.6B bits/sec
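The worked example above can be checked numerically. The sketch below computes the entropy H from the four level probabilities and then the rate R = rs * H; the bandwidth value B = 4000 Hz is a hypothetical choice for illustration, not part of the original problem.

```python
import math

# Quantization-level probabilities from the example (Q1..Q4)
probs = [1/8, 3/8, 3/8, 1/8]

# Average information per message: H = -sum(p * log2(p))
H = -sum(p * math.log2(p) for p in probs)

# Nyquist-rate sampling of a signal band-limited to B Hz
# gives r_s = 2B samples (messages) per second.
B = 4000          # hypothetical bandwidth in Hz, chosen for illustration
r_s = 2 * B

# Information rate: R = r_s * H bits/sec
R = r_s * H

print(f"H = {H:.2f} bits/message")   # ≈ 1.81, matching the 1.8 above
print(f"R = {R:.0f} bits/sec")       # ≈ 3.6 * B
```

Running this confirms H ≈ 1.81 bits/message, so R ≈ 3.6B bits/sec as derived above.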
