This course introduces the fundamentals of information theory and its applications to data compression and transmission. Measures of information, including entropy, are defined and their properties derived. The limits of lossless data compression are established, and practical coding schemes that approach these limits are presented. Lossy compression tradeoffs are discussed in the rate-distortion framework. The concept of reliable communication over noisy channels (channel capacity) is developed. Techniques for practical channel coding, including block and convolutional codes, are also covered.
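For example, the entropy of a discrete random variable X with probability mass function p, H(X) = -\sum_x p(x) \log_2 p(x) (measured in bits), quantifies the minimum average number of bits per symbol needed to represent X without loss, and serves as the benchmark against which practical compression schemes are compared.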
Prerequisites
Background in probability and random processes, as provided by ECE502 or an equivalent course.