EC2036 INFORMATION THEORY L T P C
3 0 0 3
UNIT I QUANTITATIVE STUDY OF INFORMATION 8
Basic inequalities, Entropy, Kullback-Leibler distance, Mutual information, Bounds on
entropy, Fisher information, Cramér-Rao inequality, Second law of thermodynamics,
Sufficient statistic, Entropy rates of a stochastic process
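The entropy and Kullback-Leibler distance listed above can be computed directly for finite distributions; a minimal sketch in Python (the function names are illustrative, not taken from the prescribed text):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability outcomes contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p||q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin attains the maximal entropy of 1 bit for a binary alphabet.
print(entropy([0.5, 0.5]))                       # 1.0
# One of the basic inequalities: D(p||q) >= 0, with equality iff p == q.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))     # positive
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))     # 0.0
```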
UNIT II CAPACITY OF NOISELESS CHANNEL 8
Fundamental theorem for a noiseless channel, Data compression, Kraft inequality,
Shannon-Fano codes, Huffman codes, Asymptotic equipartition, Rate distortion theory
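The Huffman construction and the Kraft inequality from this unit can be illustrated together; a minimal Python sketch (helper names are illustrative):

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Heap entries: [weight, tie-breaker, partial code table].
    heap = [[p, i, {s: ""}] for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # two least probable subtrees
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], counter, merged])
        counter += 1
    return heap[0][2]

def kraft_sum(code):
    """Sum of 2^(-l_i); the Kraft inequality bounds this by 1 for any prefix code."""
    return sum(2 ** -len(c) for c in code.values())

# Dyadic probabilities give codeword lengths equal to -log2(p): 1, 2, 3, 3.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)
print(kraft_sum(code))   # 1.0, since Huffman codes are complete
```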
UNIT III CHANNEL CAPACITY 9
Properties of channel capacity, Jointly typical sequences, Channel coding theorem,
Converse to the channel coding theorem, Joint source-channel coding theorem
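The capacity notion in this unit can be made concrete with the binary symmetric channel, whose capacity is C = 1 - H(p) for crossover probability p; a small Python sketch (function names are illustrative):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits for a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel, in bits per use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0 : noiseless channel
print(bsc_capacity(0.5))    # 0.0 : output independent of input
print(bsc_capacity(0.11))   # close to 0.5
```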
UNIT IV DIFFERENTIAL ENTROPY AND GAUSSIAN CHANNEL 9
AEP for continuous random variables, relationship between continuous and discrete
entropy, properties of differential entropy, Gaussian channel definitions, converse to
coding theorem for Gaussian channel, channels with colored noise, Gaussian channels
with feedback.
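The Gaussian channel capacity C = (1/2) log2(1 + P/N), central to this unit, is easy to evaluate numerically; a minimal sketch in Python (the `snr` parameter is an illustrative stand-in for the ratio P/N):

```python
import math

def gaussian_capacity(snr):
    """C = (1/2) log2(1 + P/N) bits per transmission, with snr = P/N."""
    return 0.5 * math.log2(1.0 + snr)

print(gaussian_capacity(1.0))    # 0.5 bit per channel use at 0 dB SNR
print(gaussian_capacity(15.0))   # 2.0 bits per channel use
```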
UNIT V NETWORK INFORMATION THEORY 11
Gaussian multiple-user channels, Multiple access channel, Encoding of correlated
sources, Broadcast channel, Relay channel, Source coding and rate distortion with
side information, General multi-terminal networks.
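For the two-user Gaussian multiple access channel in this unit, the capacity region is the pentagon R1 <= C(P1/N), R2 <= C(P2/N), R1 + R2 <= C((P1 + P2)/N), where C(x) = (1/2) log2(1 + x); a minimal membership check in Python (function names are illustrative):

```python
import math

def c(x):
    """The Gaussian capacity function C(x) = (1/2) log2(1 + x)."""
    return 0.5 * math.log2(1.0 + x)

def in_gaussian_mac_region(r1, r2, p1, p2, n):
    """Check (R1, R2) against the two-user Gaussian multiple-access region."""
    return (r1 <= c(p1 / n) and
            r2 <= c(p2 / n) and
            r1 + r2 <= c((p1 + p2) / n))

# With P1 = P2 = N, each user alone can achieve 0.5 bit, but not both at once:
print(in_gaussian_mac_region(0.5, 0.5, 1.0, 1.0, 1.0))    # False: sum rate exceeds C(2)
print(in_gaussian_mac_region(0.5, 0.29, 1.0, 1.0, 1.0))   # True
```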
TOTAL = 45 PERIODS
TEXT BOOK
1. Thomas M. Cover and Joy A. Thomas, "Elements of Information Theory", Wiley, 1999.
REFERENCE
1. David J. C. MacKay, "Information Theory, Inference and Learning Algorithms", Cambridge University Press, 2003.