A source produces three symbols A, B and C with probabilities P(A) = ½, P(B) = ¼ and P(C) = ¼. The source entropy is
A. ½ bit/symbol
B. 1 bit/symbol
C. 1 ¼ bits/symbol
D. 1 ½ bits/symbol
The correct answer is: D
SOLUTION
Concept:
The entropy of a probability distribution is the average amount of information obtained per symbol drawn from that distribution.
It is calculated as:
\(H=\sum\limits_{i=1}^{n} p_i \log_2\left( \frac{1}{p_i} \right)\ \text{bits/symbol}\)
where \(p_i\) is the probability of occurrence of the i-th symbol.
Calculation:
Given:
P(A) = ½, P(B) = ¼ and P(C) = ¼
The entropy will be:
\(H= \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4\)
\(H= \frac{1}{2} + \frac{2}{4} + \frac{2}{4}\)
H = 3/2 = 1 ½ bits/symbol
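For readers who want to check the arithmetic, here is a minimal Python sketch (an illustration only, not part of the original solution; the helper name entropy is an assumption) that evaluates the same formula for the given distribution:

import math

def entropy(probs):
    # Shannon entropy in bits/symbol: H = sum of p * log2(1/p) over all symbols
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Probabilities of symbols A, B and C from the question
print(entropy([1/2, 1/4, 1/4]))  # prints 1.5, i.e. 1 ½ bits/symbol (option D)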