## Lossless Compression Algorithm

1. (a) What is the entropy (η) of the image below, where the numbers (0, 20, 50, 99) denote the gray-level intensities?

```
99 99 99 99 99 99 99 99
20 20 20 20 20 20 20 20
 0  0  0  0  0  0  0  0
 0  0 50 50 50 50  0  0
 0  0 50 50 50 50  0  0
 0  0 50 50 50 50  0  0
 0  0 50 50 50 50  0  0
 0  0  0  0  0  0  0  0
```
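As a quick check on part (a), the entropy can be computed from the intensity histogram of the 8×8 image (a minimal Python sketch; the counts are read off the grid above):

```python
from collections import Counter
from math import log2

# Pixel counts from the 8x8 image: 64 pixels total.
counts = Counter({0: 32, 50: 16, 99: 8, 20: 8})
total = sum(counts.values())

# Shannon entropy: eta = -sum_i p_i * log2(p_i)
eta = -sum((n / total) * log2(n / total) for n in counts.values())
print(eta)  # 1.75 bits per pixel
```

All four probabilities (1/2, 1/4, 1/8, 1/8) are negative powers of two, which is why the entropy comes out to an exact 1.75 bits.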

(b) Show step by step how to construct the Huffman tree to encode the above four intensity values in this image. Show the resulting code for each intensity value.
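The by-hand tree construction asked for in (b) can be cross-checked with a standard heap-based Huffman build (a sketch, not the required derivation; frequencies are the pixel counts from the image):

```python
import heapq

def huffman_code_lengths(freqs):
    """Return {symbol: code length} for a Huffman code over freqs."""
    # Each heap entry: (weight, tiebreak counter, {symbol: depth so far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two lowest-weight subtrees; every symbol inside
        # them moves one level deeper in the final tree.
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

lengths = huffman_code_lengths({0: 32, 50: 16, 99: 8, 20: 8})
print(lengths)  # {0: 1, 50: 2, 99: 3, 20: 3}
```

The exact bit patterns depend on 0/1 labeling choices at each merge, but the code lengths (1, 2, 3, 3) are fixed for these frequencies.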

(c) What is the average number of bits needed for each pixel, using your Huffman code? How does it compare to η?
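For part (c), the average is just the count-weighted mean of the code lengths (a small check; lengths 1, 2, 3, 3 are the Huffman lengths for these frequencies):

```python
# Pixel counts and Huffman code lengths for intensities 0, 50, 99, 20.
counts  = {0: 32, 50: 16, 99: 8, 20: 8}   # 64 pixels total
lengths = {0: 1,  50: 2,  99: 3, 20: 3}

total = sum(counts.values())
avg_bits = sum(counts[s] * lengths[s] for s in counts) / total
print(avg_bits)  # 1.75 -- matches the entropy exactly
```

The average equals η here because every symbol probability is a power of 1/2, the case in which Huffman coding is optimal with no overhead.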

2. (a) What are the advantages of arithmetic coding over Huffman coding?

(b) Suppose the alphabet is [A, B, C], and the known probability distribution is P(A) = 0.5, P(B) = 0.4, P(C) = 0.1. For simplicity, let's also assume that both encoder and decoder know that the length of the messages is always 3, so there is no need for a terminator.

i. How many bits are needed to encode the message BBB by Huffman coding?

ii. How many bits are needed to encode the message BBB by arithmetic coding?
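For part ii, the key quantity is the width of the final arithmetic-coding interval for BBB. A sketch of the interval narrowing (taking the cumulative distribution in the order A, B, C, which is an assumption about the coder's symbol ordering):

```python
from math import ceil, log2

probs = {"A": 0.5, "B": 0.4, "C": 0.1}

# Cumulative lower bounds of each symbol's subinterval, in order A, B, C.
cum, acc = {}, 0.0
for s, p in probs.items():
    cum[s] = acc
    acc += p

def encode_interval(msg):
    """Return (low, width) of the final interval [low, low + width) for msg."""
    low, width = 0.0, 1.0
    for s in msg:
        low += width * cum[s]   # zoom into the symbol's subinterval
        width *= probs[s]
    return low, width

low, width = encode_interval("BBB")
# Width is 0.4**3 = 0.064, so about ceil(-log2(0.064)) = 4 bits identify
# a binary fraction inside the interval (a practical coder may need a
# bit or two more to terminate cleanly).
print(low, width, ceil(-log2(width)))
```

Compare this with the Huffman answer in part i: a Huffman code for these probabilities assigns 2 bits to B, so BBB costs 6 bits, while arithmetic coding gets closer to the 3 × (−log₂ 0.4) ≈ 3.97-bit information content.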

3. Consider the dictionary-based LZW compression algorithm. Suppose the alphabet is the set of symbols {0, 1}. Show the dictionary (symbol strings and their associated codes) and outputs for LZW compression of the input

0 1 1 0 0 1 1
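A minimal LZW encoder over the alphabet {0, 1} can be used to verify the dictionary and output sequence (numbering the initial codes from 0 is an assumption; some textbooks start at 1):

```python
def lzw_encode(data, alphabet=("0", "1")):
    """LZW-compress a string; return (output codes, final dictionary)."""
    table = {s: i for i, s in enumerate(alphabet)}
    out = []
    w = ""  # current longest match held by the encoder
    for c in data:
        if w + c in table:
            w += c                      # extend the match
        else:
            out.append(table[w])        # emit code for the match
            table[w + c] = len(table)   # add new string to dictionary
            w = c
    if w:
        out.append(table[w])            # flush the final match
    return out, table

codes, table = lzw_encode("0110011")
print(codes)  # [0, 1, 1, 0, 2, 1]
print(table)  # {'0': 0, '1': 1, '01': 2, '11': 3, '10': 4, '00': 5, '011': 6}
```

Note the fifth output, code 2: by that point the string "01" added at the first step is already in the dictionary, so the encoder emits a single code for both symbols.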