
Additional Problem 8: Huffman Codes vs. Entropy (Preliminary version)

\( \def\Huf{Huf} \)

Assume we have $n$ items with positive weights (probabilities) $p_1,p_2,\ldots,p_n$ such that $\sum_i\, p_i=1$. Denote $\bar{p}=(p_1,p_2,\ldots,p_n)$. The Huffman algorithm constructs a binary tree with the items assigned to its leaves. Let $l_i$ be the depth (the number of edges from the root) of the leaf holding item $i$. The average length of the Huffman code is $\Huf(\bar{p})=\sum_i\, p_i\cdot l_i$.
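
For concreteness, here is a minimal Python sketch of how the lengths $l_i$ and the value $\Huf(\bar{p})$ could be computed with the usual heap-based merging loop; the helper names `huffman_lengths` and `average_length` are illustrative choices, not part of the problem.

```python
import heapq

def huffman_lengths(p):
    """Return the leaf depths l_i produced by Huffman's algorithm for weights p."""
    n = len(p)
    if n == 1:
        return [0]
    # Each heap entry: (subtree weight, tie-breaking counter, original item indices in the subtree).
    heap = [(w, i, [i]) for i, w in enumerate(p)]
    heapq.heapify(heap)
    depth = [0] * n
    counter = n
    while len(heap) > 1:
        w1, _, items1 = heapq.heappop(heap)
        w2, _, items2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf inside them one level deeper.
        for i in items1 + items2:
            depth[i] += 1
        heapq.heappush(heap, (w1 + w2, counter, items1 + items2))
        counter += 1
    return depth

def average_length(p):
    """Huf(p) = sum_i p_i * l_i."""
    return sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
```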

An important concept in information theory is entropy. The vector $\bar{p}$ is treated as a source of information and its entropy is $$E(\bar{p})\,=\, -\sum_i p_i\cdot \log_2 p_i.$$
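
A matching one-liner for the entropy (again, the name `entropy` is just an illustrative choice):

```python
import math

def entropy(p):
    """E(p) = -sum_i p_i * log2(p_i) for a probability vector p with positive entries summing to 1."""
    return -sum(pi * math.log2(pi) for pi in p)

# Example: for the uniform distribution on 4 items, E(p) = log2(4) = 2.
# print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```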

Show that $E(\bar{p})\le \Huf(\bar{p})\le E(\bar{p})+1$.
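
The claim can also be sanity-checked numerically before proving it. The snippet below reuses the hypothetical `entropy` and `average_length` helpers sketched above and tests the bounds on random distributions; this is only an illustration, not a proof.

```python
import random

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 8)
    w = [random.random() for _ in range(n)]
    s = sum(w)
    p = [x / s for x in w]            # normalize so the weights sum to 1
    e, h = entropy(p), average_length(p)
    # Allow a tiny tolerance for floating-point error.
    assert e - 1e-9 <= h <= e + 1 + 1e-9, (p, e, h)
print("E(p) <= Huf(p) <= E(p) + 1 held on all random samples")
```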

Use the following fact: if $p_1,\ldots,p_n$ and $q_1,\ldots,q_n$ are two sequences of positive reals with the same sum, then $-\sum_i\,p_i \log q_i\geq -\sum_i\,p_i \log p_i$. It follows easily from the inequality $\ln x \le x-1$ for $x>0$, applied with $x=q_i/p_i$.
