Information Theory and Coding

Instructor Emre Telatar
Office INR 117
Phone +41 21 69 37693
Email emre.telatar@epfl.ch
Office Hours By appointment
Teaching Assistant Erixhen Sula
Phone +41 21 69 31229
Office INR 031
Email erixhen.sula@epfl.ch
Office Hours By appointment
Teaching Assistant Yunus İnan
Email yunus.inan@epfl.ch
Office Hours By appointment
Teaching Assistant Jean-Baptiste Cordonnier
Email jean-baptiste.cordonnier@epfl.ch
Office Hours By appointment
Teaching Assistant Pierre Quinton
Email pierre.quinton@epfl.ch
Teaching Assistant Simon Guilloud
Email simon.guilloud@epfl.ch
Lectures Monday 13:00 – 15:00 (Room: ELA 1)
Tuesday 13:00 – 15:00 (Room: ELA 2)
Exercises Tuesday 15:00 – 17:00 (Room: ELA 2)
Language: English
Credits: 7 ECTS

Course Information

See the course information.

Special Announcements

Midterm Exam

    • Tuesday 30 October, 13:15-16:00, Location: AAC 2 31.
    • The exam is closed book and closed notes; one two-sided A4 cheat sheet is allowed.
    • The histogram of midterm grades can be found here: midterm_histo

Graded Homework

    • Due Monday, December 3. Solutions may be handwritten or typewritten.

Final Exam

Detailed Schedule

Date Topics Covered Assignment Solutions Remarks
Sep 17 Public Holiday
Sep 18 – Introduction to Source Coding
– Non-singular codes
– Uniquely decodable codes
– Prefix-free codes
– Kraft’s inequality for prefix-free codes
– Kraft’s sum for products of codes
– Kraft’s inequality for non-singular codes
hw1.pdf sol1.pdf
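Kraft’s inequality says the codeword lengths l_i of any prefix-free D-ary code satisfy Σ_i D^(−l_i) ≤ 1. A minimal numerical check (the function name is my own, not part of the course material):

```python
def kraft_sum(lengths, D=2):
    """Compute the Kraft sum sum_i D^(-l_i) for a list of codeword lengths."""
    return sum(D ** (-l) for l in lengths)

# A prefix-free binary code must satisfy kraft_sum(lengths) <= 1.
# Lengths (1, 2, 3, 3) correspond e.g. to the code {0, 10, 110, 111}.
print(kraft_sum([1, 2, 3, 3]))  # 1.0, so the inequality holds with equality
```

Lengths with a Kraft sum above 1 (e.g. three codewords of lengths 1, 1, 2) cannot belong to any prefix-free binary code.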
Sep 24 – Kraft’s inequality for uniquely decodable codes
– Reverse Kraft’s inequality
– Expected codeword length
– Entropy as a lower-bound to the expected codeword length
– Existence of prefix-free codes with expected length at most entropy + 1
Sep 25 – Properties of optimal codes
– Huffman procedure
– Equivalence of prefix-free codes and strategies for guessing via binary questions
hw2.pdf sol2.pdf
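The Huffman procedure repeatedly merges the two least probable symbols. A small sketch that tracks only the resulting codeword lengths (all names are my own; a full implementation would also record the bits assigned at each merge):

```python
import heapq

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for the given pmf."""
    # Heap entries: (probability, tie-breaker, symbol indices in the subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:   # each merge adds one bit to every merged symbol
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.25]
L = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, L))  # expected codeword length
print(L, avg)  # [1, 2, 2] 1.5 -- for a dyadic pmf, average length = entropy
```

For this dyadic pmf the expected length 1.5 equals the entropy exactly; in general the Huffman code satisfies H(U) ≤ E[length] < H(U) + 1.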
Oct 01 – Entropy of multiple random variables
– Source coding by describing words of n letters
– Conditional entropy and mutual information
– Chain Rules for entropy and mutual information
Oct 02 – Short reminder for Markov Chains
– Minimal and maximal values of H(U)
– Data Processing inequality
– Entropy Rate
– Entropy rate of Stationary Processes
– Coding Theorem for Stationary Sources
– Fixed-to-Fixed Length Source Codes
– Typicality
– Properties of Typical Sets
– Asymptotic Equipartition Property
hw3.pdf sol3.pdf
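The chain rule H(X,Y) = H(X) + H(Y|X) can be checked numerically on any joint pmf; the distribution below is a made-up example for illustration:

```python
import math

def H(pmf):
    """Entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Hypothetical joint pmf of (X, Y), chosen only for illustration.
pXY = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}
pX = {x: sum(p for (a, _), p in pXY.items() if a == x) for x in (0, 1)}

H_XY = H(pXY.values())          # joint entropy H(X,Y)
H_X = H(pX.values())            # marginal entropy H(X)
H_Y_given_X = H_XY - H_X        # conditional entropy via the chain rule
print(H_XY, H_X, H_Y_given_X)
```

As the lecture shows, conditioning cannot increase entropy: H(Y|X) computed this way is at most the marginal entropy H(Y).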
Oct 08 – Properties of Typical Sets (cont.)
– Source Coding via Typicality
– Finite state machine (FSM) compressors
– Invertible machines, information lossless (IL) machines
Oct 09 – Lower bound to the output length for IL FSMs
– Universal coding example for iid binary sources
– Lempel-Ziv data compression algorithm
– Analysis and competitive optimality of Lempel-Ziv with respect to FSMs
– Asymptotic optimality of Lempel-Ziv
hw4.pdf sol4.pdf Notes on Lempel-Ziv
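The incremental parsing step of the Lempel-Ziv (LZ78) algorithm splits the source string into the shortest phrases not seen earlier in the parsing; the number of phrases governs the compressed length. A minimal sketch of the parsing alone (encoding of the phrase pointers is omitted):

```python
def lz78_parse(s):
    """Incremental (LZ78) parsing: split s into the shortest phrases
    not seen earlier in the parsing."""
    seen, phrases, cur = set(), [], ""
    for c in s:
        cur += c
        if cur not in seen:      # a new phrase is complete
            seen.add(cur)
            phrases.append(cur)
            cur = ""
    if cur:                      # a possibly repeated tail phrase
        phrases.append(cur)
    return phrases

print(lz78_parse("1011010100010"))
# ['1', '0', '11', '01', '010', '00', '10']
```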
Oct 15 – Communication channels
– Memoryless/stationary channels, communication without feedback
– C(W) = max I(X;Y), with the binary erasure channel and the binary symmetric channel as examples
Oct 16 – Fano’s inequality
– Converse Part of Channel Coding Theorem
hw5.pdf sol5.pdf
Oct 22 – Computation of C(W), KKT conditions
Oct 23 – Channel coding theorem
– Proofs by random constructions
hw6.pdf sol6.pdf Some old midterm exams

midterm2017
mid_sol2017
midterm2015
mid_sol2015

Oct 29 – Proof of the channel coding theorem via random codes and typicality decoder
Oct 31 – Midterm (midterm, solutions)
Nov 05 – Feinstein’s Theorem
Nov 06 – Channels with input cost
– Coding theorem for channels with cost
hw7.pdf sol7.pdf
Nov 12 – Differential Entropy
– Properties of Differential Entropy
– Gaussian Channel
Nov 13 – Parallel Gaussian Channels, water-filling
– Bandlimited AWGN Channels
hw8.pdf sol08.pdf
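Water-filling allocates power P_i = max(μ − N_i, 0) over parallel Gaussian channels with noise levels N_i, where the water level μ is chosen so the allocations sum to the total budget P. A minimal sketch that finds μ by bisection (names and tolerances are my own):

```python
def water_filling(noise, P, iters=100):
    """Allocate total power P over parallel Gaussian channels with noise
    levels noise[i]: P_i = max(mu - N_i, 0), mu found by bisection."""
    lo, hi = min(noise), max(noise) + P
    for _ in range(iters):
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0) for n in noise)
        if used > P:            # water level too high
            hi = mu
        else:                   # water level too low (or exact)
            lo = mu
    return [max(mu - n, 0) for n in noise]

powers = water_filling([1.0, 2.0, 4.0], P=3.0)
print(powers)  # water level mu = 3: allocations [2, 1, 0]
```

Note the noisiest channel gets no power here: its noise level 4 sits above the water level 3.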
Nov 19 – Hamming code
– Hamming weight
– Hamming distance
Nov 20 – Sphere-packing bound
– Gilbert bound
– Singleton bound
– Linear codes
– Generator and parity check matrices
hw9.pdf
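The Hamming code corrects any single bit error. A compact sketch of the (7,4) case: the columns of the parity-check matrix are the binary representations of 1..7, so a received word’s syndrome directly names the error position (names are my own):

```python
# Parity-check matrix of the (7,4) Hamming code: column i is the binary
# representation of i+1 (MSB first), so the syndrome names the error position.
H = [[(i >> k) & 1 for i in range(1, 8)] for k in range(2, -1, -1)]

def syndrome(word):
    """Syndrome of a length-7 binary word under H (arithmetic mod 2)."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Correct a single bit error, reading the syndrome as a position."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]   # 0 means no error detected
    w = list(word)
    if pos:
        w[pos - 1] ^= 1
    return w

codeword = [0] * 7        # the all-zero word is always a codeword
received = codeword[:]
received[4] ^= 1          # channel flips bit 5
print(correct(received))  # recovers the all-zero codeword
```

This illustrates why the (7,4) code meets the sphere-packing bound with equality: the 8 syndromes exactly index "no error" plus the 7 single-error patterns.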

sol9.pdf
Graded homework sol
Notes on Coding: coding.pdf

Nov 26 – Polar Codes
Nov 27 – Polar Codes hw10.pdf sol10.pdf
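For the BEC, channel polarization can be traced exactly: one polarization step turns an erasure parameter Z into the pair (2Z − Z², Z²), and recursing drives almost every synthesized channel toward either noiseless (Z ≈ 0) or useless (Z ≈ 1). A minimal sketch of this recursion (names are my own):

```python
def polarize_bec(eps, n):
    """Erasure parameters of the 2^n synthesized channels obtained by
    polarizing a BEC(eps): Z -> (2Z - Z^2, Z^2) at each step."""
    Z = [eps]
    for _ in range(n):
        Z = [z2 for z in Z for z2 in (2 * z - z * z, z * z)]
    return Z

Z = polarize_bec(0.5, 10)
good = sum(1 for z in Z if z < 1e-3)      # nearly noiseless channels
bad = sum(1 for z in Z if z > 1 - 1e-3)   # nearly useless channels
print(good / len(Z), bad / len(Z))
```

The average of Z is preserved at every step (the transform is capacity-conserving), while the fraction of nearly noiseless channels approaches the capacity 1 − ε as n grows.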
Dec 3 – Polar Codes
Dec 4 – Rate-Distortion Theory (converse part) hw11.pdf sol11.pdf
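For a Bernoulli(p) source under Hamming distortion, the rate-distortion function has the closed form R(D) = h₂(p) − h₂(D) for 0 ≤ D ≤ min(p, 1−p), and R(D) = 0 beyond that. A small numerical sketch (names are my own):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_binary(p, D):
    """R(D) = h2(p) - h2(D) for a Bernoulli(p) source, Hamming distortion,
    valid for 0 <= D <= min(p, 1-p); R(D) = 0 beyond that."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

print(rate_distortion_binary(0.5, 0.11))  # 1 - h2(0.11) bits per symbol
```

At D = 0 this recovers lossless compression (R = h₂(p)), and at D = min(p, 1−p) no description is needed at all.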
Dec 10 – Rate-Distortion Theory (achievability part)
Dec 11 – Distributed Source Coding (Slepian-Wolf) hw12.pdf sol12.pdf
Dec 17 – Multiple Access Channel (MAC)
Dec 18 – Multiple Access Channel (MAC) hw13.pdf sol13.pdf
Jan 29 – Final Exam (final, solutions)

Textbook

Additional Reading Material

To Review Basic Probability Theory