Instructor: Emre Telatar
Office: INR 117
Phone: +41 21 69 37693
Email: emre.telatar@epfl.ch
Office Hours: By appointment

Teaching Assistant: Mani Bastani Parizi
Office: INR 032
Phone: +41 21 69 31359
Email: mani.bastaniparizi@epfl.ch
Office Hours: By appointment

Teaching Assistant: Rafah ElKhatib
Office: INR 032
Phone: +41 21 69 31361
Email: rafah.elkhatib@epfl.ch
Office Hours: By appointment

Lectures: Monday 13:00 – 15:00 (Room: ELA 1)
          Tuesday 13:00 – 15:00 (Room: ELA 2)
Exercises: Tuesday 15:00 – 17:00 (Room: ELA 2)

Language: English
Credits: 7 ECTS
See the course information.
Special Announcements
Midterm Exam
Your midterm exam will take place on Tuesday, October 27, at 13:15 in rooms ELA2, DIA003, and CO120.

Please find your room and seat before the exam according to the following seating plan:

This is a closed-book exam, but you are allowed to have one A4 page of notes (double-sided) with you.
Here are some statistics about the midterm grades:
        Average   Standard Deviation
P1      6.39      3.72
P2      9.51      4.45
P3      5.09      4.09
Total   21        9.18
You can also find a histogram of the grades here.
Final Exam
Your final exam will take place on Monday, January 11, 2016, at 8:15 in rooms INM200 and INM202.

Please be there well in advance to find your room and seat.

This is a closed-book exam, but you are allowed to bring two A4 pages of notes (double-sided) with you.
Detailed Schedule
Date  Topics Covered  Assignment  Solutions  Remarks  

Sep 14  – Introduction to Source Coding – Non-singular codes – Uniquely decodable codes – Prefix-free codes – Kraft's inequality for prefix-free codes – Kraft's inequality for extensions of codes 

Sep 15  – Kraft's inequality for uniquely decodable codes – Reverse Kraft's inequality – Expected codeword length – Entropy as a lower bound on the expected codeword length 
hw1.pdf  sol1.pdf  
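As a small illustration of these two lectures, the sketch below checks Kraft's inequality and the entropy lower bound on expected codeword length for a hypothetical four-symbol source with the prefix-free code {0, 10, 110, 111} (the source and lengths are invented for illustration; for a dyadic distribution the bound is met with equality):

```python
from math import log2

def kraft_sum(lengths, D=2):
    """Kraft sum: sum of D^{-l_i}; it is <= 1 for any D-ary prefix-free code."""
    return sum(D ** (-l) for l in lengths)

def entropy(p):
    """Entropy H(X) in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# Hypothetical dyadic source and the codeword lengths of {0, 10, 110, 111}
p = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]

assert kraft_sum(lengths) <= 1          # Kraft's inequality holds
H = entropy(p)                          # 1.75 bits
L = sum(pi * li for pi, li in zip(p, lengths))  # expected length, also 1.75
```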
Sep 21  Public Holiday  
Sep 22  – Existence of prefix-free codes with average length at most entropy + 1 – Entropy of multiple random variables – Properties of optimal codes – Huffman procedure 
hw2.pdf  sol2.pdf  
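The Huffman procedure covered here can be sketched with a binary heap; this is a hedged illustration, not the lecture's exact presentation (the example distribution is hypothetical, and for a dyadic source the resulting lengths equal −log₂ p):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths from the binary Huffman merging procedure.
    Each heap entry carries (probability, tiebreak counter, member symbols);
    merging two subtrees adds one bit to every symbol they contain."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```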
Sep 28  – Equivalence of prefix-free codes and strategies for guessing via binary questions – Interpretation of entropy as the expected number of questions needed to guess the random variable – Conditional Entropy and Mutual Information – Chain Rules for entropy and mutual information 

Sep 29  – Review of Markov Chains – Data Processing inequality – Entropy Rate – Entropy rate of Stationary Processes 
hw3.pdf  sol3.pdf  
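The entropy rate of a stationary Markov chain, H = Σᵢ πᵢ H(P(·|i)), can be computed numerically; the two-state chain and its stationary distribution below are hypothetical, chosen only so that πP = π is easy to verify:

```python
from math import log2

def entropy_rate(P, pi):
    """Entropy rate of a stationary Markov chain, in bits:
    the stationary average of the entropy of each row of P."""
    return sum(-pi[i] * sum(p * log2(p) for p in row if p > 0)
               for i, row in enumerate(P))

# Hypothetical two-state chain; pi solves pi P = pi
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [5 / 6, 1 / 6]

rate = entropy_rate(P, pi)   # roughly 0.557 bits per symbol
```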
Oct 5  – Coding Theorem for Stationary Sources – Fixed-to-Fixed Length Source Codes – Typicality 

Oct 6  – Properties of Typical Sets – Asymptotic Equipartition Property – Source Coding by AEP – Variable-to-Fixed Length Source Codes – Valid and Prefix-Free Dictionaries 
hw4.pdf  sol4.pdf  
Oct 12  – Relationship between word and letter entropies for valid, prefix-free dictionaries – Tunstall procedure – Analysis of the Tunstall procedure 
Notes on Tunstall Coding  
Oct 13  – Universal Source Coding – Lempel–Ziv method – Analysis of Lempel–Ziv (finite-state machine compressors, FSM compressibility of a sequence, LZ compressibility of a sequence) 
hw5.pdf  sol5.pdf  
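The incremental parsing at the heart of the Lempel–Ziv method splits the input into the shortest phrases not seen before; a minimal sketch (the input string is an arbitrary example, and only the parsing step is shown, not the encoding of phrases into pointers):

```python
def lz78_parse(s):
    """Incremental (LZ78-style) parsing: each phrase is the shortest
    prefix of the remaining input that has not appeared as a phrase yet."""
    phrases, seen, cur = [], set(), ""
    for ch in s:
        cur += ch
        if cur not in seen:
            seen.add(cur)
            phrases.append(cur)
            cur = ""
    if cur:                      # a possibly repeated final phrase
        phrases.append(cur)
    return phrases

print(lz78_parse("ABBABBABBBAABABAA"))
# ['A', 'B', 'BA', 'BB', 'AB', 'BBA', 'ABA', 'BAA']
```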
Oct 19  – Information-Lossless FSM Compressors – Lower bound on the output length of an IL FSM compressor 
Notes on Lempel–Ziv Compression  
Oct 20  – LZ Compressibility of sequences – Optimality of Lempel–Ziv – Communication Channels – Discrete Memoryless Channels 
hw6.pdf  sol6.pdf  
Oct 26  – Examples of Discrete Memoryless Channels (BSC and BEC) – Transmission with or without feedback – Channel Capacity – Fano’s Inequality 

Oct 27  Midterm Exam  midterm.pdf  midsol.pdf  
Nov 2  – Converse to the Channel Coding Theorem  
Nov 3  – Proof of the Channel Coding Theorem  hw7.pdf  sol7.pdf  
Nov 9  – Capacity of BSC and BEC – Jensen’s Inequality – Concavity of Mutual Information in Input Distribution – KKT Conditions 
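The capacities of the BSC and BEC have closed forms, C = 1 − h₂(p) and C = 1 − ε respectively; a quick numeric sketch (the parameter values are arbitrary examples):

```python
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0, 1) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1 - h2(p)

def bec_capacity(eps):
    """Capacity of the binary erasure channel with erasure probability eps."""
    return 1 - eps

print(bsc_capacity(0.11))  # roughly 0.5 bits per channel use
print(bec_capacity(0.25))  # 0.75 bits per channel use
```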

Nov 10  – KKT Conditions (cont’d) – Application of KKT: Capacity of Z Channel – Continuous Alphabet: Differential Entropy 
hw8.pdf  sol8.pdf  
Nov 16  – Properties of differential entropy – Entropy-typical sequences – Quantization – Entropy of the Gaussian distribution 

Nov 17  – Capacity under cost constraint – Capacity of AWGN – Converse to the channel coding theorem with cost constraint – Parallel Gaussian channels (water-filling) 
hw9.pdf  sol9.pdf  
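The water-filling allocation for parallel Gaussian channels assigns power max(μ − Nᵢ, 0) to channel i, with the water level μ chosen so the powers sum to the budget P; a bisection sketch (the noise levels and budget are hypothetical):

```python
def waterfill(noise, P, tol=1e-9):
    """Water-filling over parallel Gaussian channels with noise levels N_i:
    find the level mu with sum_i max(mu - N_i, 0) = P by bisection,
    then return the per-channel power allocation."""
    lo, hi = min(noise), min(noise) + P + max(noise)   # used(lo)=0, used(hi)>=P
    while hi - lo > tol:
        mu = (lo + hi) / 2
        if sum(max(mu - n, 0) for n in noise) < P:
            lo = mu
        else:
            hi = mu
    mu = (lo + hi) / 2
    return [max(mu - n, 0) for n in noise]

# Hypothetical noise levels and total power budget: the noisiest
# channel (N=4) gets no power at all here
alloc = waterfill([1.0, 2.0, 4.0], P=3.0)   # roughly [2, 1, 0]
```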
Nov 23  – Proof of Channel Coding Theorem for general channels via Threshold Decoding  
Nov 24  – Channel Codes – Minimum Distance – Singleton Bound – Sphere-packing Bound – Gilbert–Varshamov Bound 
hw10.pdf  sol10.pdf  
Nov 30  – Linear Codes – Generator Matrix – Parity-check Matrix – Hamming Codes 
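A parity-check matrix makes syndrome decoding concrete: for the binary (7,4) Hamming code, the columns of H are the seven nonzero length-3 vectors, so the syndrome of a single-error word directly names the error position. A minimal sketch (one standard choice of H; column ordering varies between texts):

```python
# Parity-check matrix of the (7,4) Hamming code:
# column j (1-indexed) is the binary expansion of j, least significant bit on top
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(word):
    """Syndrome s = H w^T over GF(2)."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Correct a single bit error: a nonzero syndrome, read as a binary
    number, is exactly the 1-indexed position of the flipped bit."""
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]
    if pos:
        word = word[:]
        word[pos - 1] ^= 1
    return word

received = [0, 0, 0, 0, 1, 0, 0]        # all-zero codeword with bit 5 flipped
assert correct(received) == [0] * 7     # single error corrected
```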

Dec 1  – Reed–Solomon Codes  hw11.pdf  sol11.pdf  
Dec 7  – Polar Codes  
Dec 8  – Polar Codes  hw12.pdf  sol12.pdf  
Dec 14  – Polar Codes  
Dec 15  – Rate–Distortion Theory  
Jan 11  Final Exam  final.pdf  finalsol.pdf 
Textbook

T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006.
Additional Reading Material

C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 1948.

Notes on Tunstall Coding.

Notes on Lempel–Ziv Algorithm.
To Review Basic Probability Theory

K. L. Chung, A Course in Probability Theory, Academic Press.

W. Feller, An Introduction to Probability Theory and Its Applications, Wiley.

G. Grimmett and D. Stirzaker, Probability and Random Processes, Oxford University Press.

A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill.

S. M. Ross, A First Course in Probability, Pearson Education.