Information Theory and Coding

 

Instructor Emre Telatar
Office INR 117
Phone +41 21 69 37693
Email emre.telatar@epfl.ch
Office Hours By appointment
   
   
Teaching Assistant Mani Bastani Parizi
Phone +41 21 69 31359
Office INR 032
Email mani.bastaniparizi@epfl.ch
Office Hours By appointment
   
   
Teaching Assistant Rafah El-Khatib
Phone +41 21 69 31361
Office INR 032
Email rafah.el-khatib@epfl.ch
Office Hours By appointment
   
   

 

Lectures Monday 13:00 – 15:00 (Room: ELA 1)
  Tuesday 13:00 – 15:00 (Room: ELA 2)
Exercises Tuesday 15:00 – 17:00 (Room: ELA 2)

 

Language:   English
Credits:   7 ECTS

 

See the course information.

Special Announcements

 

Midterm Exam

Your midterm exam will take place on Tuesday, October 27, at 13:15 in rooms ELA2, DIA003, and CO120.

  • Please find your room and seat before the exam according to the following seating plan:
    • If your last name starts with A to KAR, please go to room ELA2.
    • If your last name starts with KAS to P, please go to room DIA003.
    • If your last name starts with R to Z, please go to room CO120.
  • This is a closed-book exam, but you are allowed to have one A4 page of notes (double-sided) with you.

Here are some statistics about the midterm grades:

  Problem Average Standard Deviation
  P1      6.39    3.72
  P2      9.51    4.45
  P3      5.09    4.09
  Total   21      9.18

You can also find a histogram of the grades here.

Final Exam

Your final exam will take place on Monday, January 11, 2016, at 8:15 in rooms INM200 and INM202.

  • Please be there well in advance to find your room and seat.
  • This is a closed-book exam, but you are allowed to bring two A4 pages of notes (double-sided) with you.

Detailed Schedule

Date Topics Covered Assignment Solutions Remarks  
Sep 14 – Introduction to Source Coding
– Non-singular codes
– Uniquely decodable codes
– Prefix-free codes
– Kraft’s inequality for prefix-free codes
– Kraft’s inequality for extensions of codes
       
Sep 15 – Kraft’s inequality for uniquely decodable codes
– Reverse Kraft’s inequality
– Expected codeword length
– Entropy as a lower bound on the expected codeword length
hw1.pdf sol1.pdf    
Sep 21       Public Holiday :-)  
Sep 22 – Existence of prefix-free codes with average length at most entropy + 1
– Entropy of multiple random variables
– Properties of optimal codes
– Huffman procedure
hw2.pdf sol2.pdf    
Sep 28 – Equivalence of prefix-free codes and strategy for guessing via binary questions
– Interpretation of entropy as expected number of questions for guessing the random variable
– Conditional Entropy and Mutual Information
– Chain Rules for entropy and mutual information
       
Sep 29 – Review of Markov Chains
– Data Processing inequality
– Entropy Rate
– Entropy rate of Stationary Processes
hw3.pdf sol3.pdf    
Oct 5 – Coding Theorem for Stationary Sources
– Fixed-to-Fixed Length Source Codes
– Typicality
       
Oct 6 – Properties of Typical Sets
– Asymptotic Equipartition Property
– Source Coding by AEP
– Variable-to-Fixed Length Source Codes
– Valid and Prefix-Free Dictionaries
hw4.pdf sol4.pdf    
Oct 12 – Relationship between word- and letter-entropies for valid, prefix-free dictionaries
– Tunstall procedure
– Analysis of Tunstall procedure
    Notes on Tunstall Coding  
Oct 13 – Universal Source Coding
– Lempel–Ziv method
– Analysis of Lempel–Ziv (finite-state machine compressors, FSM compressibility of a sequence, LZ compressibility of a sequence)
hw5.pdf sol5.pdf    
Oct 19 – Information-Lossless FSM Compressors
– Lower bound on the output length of an IL FSM Compressor
    Notes on Lempel–Ziv Compression  
Oct 20 – LZ Compressibility of sequences
– Optimality of Lempel–Ziv
– Communication Channels
– Discrete Memoryless Channels
hw6.pdf sol6.pdf    
Oct 26 – Examples of Discrete Memoryless Channels (BSC and BEC)
– Transmission with or without feedback
– Channel Capacity
– Fano’s Inequality
       
Oct 27 Midterm Exam midterm.pdf midsol.pdf    
Nov 2 – Converse to the Channel Coding Theorem        
Nov 3 – Proof of the Channel Coding Theorem hw7.pdf sol7.pdf    
Nov 9 – Capacity of BSC and BEC
– Jensen’s Inequality
– Concavity of Mutual Information in Input Distribution
– KKT Conditions
       
Nov 10 – KKT Conditions (cont’d)
– Application of KKT: Capacity of Z Channel
– Continuous Alphabet: Differential Entropy
hw8.pdf sol8.pdf    
Nov 16 – Properties of differential entropy
– Entropy-typical sequences
– Quantization
– Entropy of Gaussian distribution
       
Nov 17 – Capacity under cost constraint
– Capacity of AWGN
– Converse to the channel coding theorem with cost constraint
– Parallel Gaussian channels (water-filling)
hw9.pdf sol9.pdf    
Nov 23 – Proof of Channel Coding Theorem for general channels via Threshold Decoding        
Nov 24 – Channel Codes
– Minimum Distance
– Singleton Bound
– Sphere-packing Bound
– Gilbert–Varshamov Bound
hw10.pdf sol10.pdf    
Nov 30 – Linear Codes
– Generator Matrix
– Parity-check Matrix
– Hamming Codes
       
Dec 1 – Reed–Solomon Codes hw11.pdf sol11.pdf    
Dec 7 – Polar Codes        
Dec 8 – Polar Codes hw12.pdf sol12.pdf    
Dec 14 – Polar Codes        
Dec 16 – Rate–Distortion Theory        
Jan 11 Final Exam final.pdf finalsol.pdf    
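As a small, self-contained illustration of the source-coding topics in the first weeks of the schedule (Kraft's inequality, the entropy lower bound on expected codeword length, and the Huffman procedure), here is a minimal sketch in Python. The symbol probabilities are a hypothetical example chosen to be dyadic, so the Huffman code meets the entropy bound exactly:

```python
import heapq
from math import log2

def huffman_code(probs):
    """Minimal sketch of the Huffman procedure: repeatedly merge the
    two least probable subtrees until a single code tree remains."""
    # Heap entries: (probability, tie-breaking counter, {symbol: codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        # Prepend a bit to every codeword in each merged subtree.
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

# Hypothetical dyadic source (probabilities are powers of 1/2).
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)

avg_len = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
kraft = sum(2 ** -len(w) for w in code.values())
# For a dyadic source, the expected length equals the entropy, and
# Kraft's inequality holds with equality (the prefix-free code is complete).
```

For this source the code has lengths (1, 2, 3, 3), so the expected length and the entropy are both 1.75 bits, and the Kraft sum equals 1.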

Textbook

Additional Reading Material

To Review Basic Probability Theory