Course Learning Outcomes:
- Computing Shannon's information measures (entropy, Kullback-Leibler distance and mutual information), as illustrated in the sketch after this list.
- Computing the capacity of communication channels.
- Reasoning about the properties of Shannon's information measures (entropy, Kullback-Leibler distance and mutual information).
- Using mathematical tools to infer properties of coding and communication systems.
- Applying probabilistic models of communication systems to source and channel coding.
- Using tools from probability theory to analyze communication systems.
- Assessing data compression code designs using quantitative performance metrics.
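
As a rough illustration of the computational outcomes above, the sketch below evaluates entropy, Kullback-Leibler distance, mutual information, and the capacity of a binary symmetric channel for small hand-specified distributions, using base-2 logarithms (bits). The function names and the NumPy-based style are illustrative assumptions, not course material.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; terms with p = 0 contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_distance(p, q):
    """Kullback-Leibler distance D(p||q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(pxy):
    """Mutual information I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint pmf matrix."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    return kl_distance(pxy.ravel(), (px @ py).ravel())

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps:
    C = 1 - H_b(eps) bits per channel use."""
    return 1.0 - entropy([eps, 1.0 - eps])

if __name__ == "__main__":
    print(entropy([0.5, 0.5]))                     # 1.0 bit for a fair coin
    print(kl_distance([0.5, 0.5], [0.9, 0.1]))     # > 0 whenever p != q
    print(mutual_information([[0.4, 0.1],
                              [0.1, 0.4]]))        # I(X;Y) for a noisy binary pair
    print(bsc_capacity(0.11))                      # roughly 0.5 bits per channel use
```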