Assume we have $n$ items with positive weights (probabilities) $p_1,p_2,\ldots,p_n$ such that $\sum_i\, p_i=1$. Denote $\bar{p}=(p_1,p_2,\ldots,p_n)$. The Huffman algorithm constructs a binary tree with the items assigned to its leaves. Let $l_i$ be the depth of the leaf holding item $i$, i.e., the number of edges on the path from the root to that leaf. The average length of the Huffman code is $\Huf(\bar{p})=\sum_i\, p_i\cdot l_i$.
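As an illustration (a minimal Python sketch, not part of the original presentation; the name \texttt{huffman\_avg\_length} is ours), $\Huf(\bar{p})$ can be computed without building the tree explicitly: each merge of the two smallest weights pushes every leaf below the merged node one edge deeper, so summing the merged weights over all merges yields $\sum_i p_i\cdot l_i$.
\begin{verbatim}
import heapq

def huffman_avg_length(p):
    """Average Huffman code length sum_i p_i * l_i for weights p."""
    heap = list(p)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)  # smallest remaining weight
        b = heapq.heappop(heap)  # second-smallest remaining weight
        total += a + b           # this merge deepens all leaves below by one edge
        heapq.heappush(heap, a + b)
    return total
\end{verbatim}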
An important concept in information theory is entropy. The vector $\bar{p}$ is treated as a source of information and its entropy is $$E(\bar{p})\,=\, -\sum_i p_i\cdot \log_2 p_i.$$
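Entropy is straightforward to evaluate numerically and to compare against $\Huf(\bar{p})$ (again a sketch under the same assumptions; \texttt{entropy} is our name, and all weights are positive as assumed above). For the dyadic vector $\bar{p}=(1/2,1/4,1/4)$ both quantities equal $1.5$:
\begin{verbatim}
import math

def entropy(p):
    # E(p) = -sum_i p_i * log2(p_i); all weights assumed positive
    return -sum(pi * math.log2(pi) for pi in p)

p = (0.5, 0.25, 0.25)
print(entropy(p))              # 1.5
print(huffman_avg_length(p))   # 1.5, using the sketch above
\end{verbatim}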