Introduction to Information Theory (Fall 2021)

Program: Bachelor Mathematics, Informatics, AI, Physics & Astronomy (WI, WInf, IN, KI, NS)
Lecturer: Michael Walter
Teaching assistants: Yachen Liu, Misho Yanakiev
Schedule: Nov-Dec 2021 (period 2)
Further information: DataNose, Canvas

The Canvas course page is the primary source for all course material. This page is offered only for your convenience.

Course description

This course will give an introduction to information theory – the mathematical theory of information. Ever since its inception, information theory has had a profound impact on society. It underpins important technological developments, from reliable memories to mobile phone standards, and its versatile mathematical toolbox has found use in computer science, machine learning, physics, and even pure mathematics.

Starting from probability theory, we will discuss how to mathematically model information sources and communication channels, how to optimally compress information, and how to design error-correcting codes that allow us to reliably communicate over noisy communication channels. We will also see how techniques used in information theory can be applied more generally to make predictions from noisy data.
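To give a small taste of the kind of quantity the course revolves around, here is a minimal (unofficial) Python sketch of the Shannon entropy of a source – the fundamental limit for lossless compression that the course will develop rigorously:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), measured in bits.
    Terms with p(x) = 0 contribute nothing, by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss;
# a heavily biased coin carries much less.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # roughly 0.47 bits
```

As the course will show, a source with entropy H can be compressed to about H bits per symbol on average, and no further.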

See also last year’s course homepage for an impression of what we will cover in this course.

We will discuss rigorous definitions and proofs, hence some “mathematical maturity” will be assumed. We will use basic probability theory throughout the course (and will remind you of the most important facts). In addition, part of the homework will require programming in Python.

Lecture notes and video recordings

Video recordings are available on Canvas.

Our main reference (which is optional, since handwritten lecture notes and video recordings will be provided) is David J. C. MacKay’s beautiful textbook “Information Theory, Inference, and Learning Algorithms”, Cambridge (2003). It is available online 📖 and contains much more material than what we will be able to discuss in class.

Homework

Practice Problems

In addition to the graded homework, we will offer many practice problems (one set for each exercise class) so that you can test your understanding as the course progresses.

Learning objectives

At the end of the course, you will be able to:

We will check off these objectives as we move along in the course.

Format

Lectures, exercises classes, self-study, and a short presentation.

Assessment

The final grade will be determined by the following calculation:

60% exam grade + 30% homework grade + 10% presentation grade

The same rule applies to the re-sit exam.

There will be one homework problem set per week (6 in total), posted on the course homepage by Monday. You must submit your completed homework on Canvas before the Monday of the following week. We will ask you to collaborate and submit in groups of 3-4 students. Solutions will be discussed (among other things) in the Wednesday exercise class. Late assignments will be accepted only if you have extenuating circumstances (such as sickness or a family emergency) and you confirm with the lecturer before the deadline. Your problem set with the lowest score will be ignored (this includes any problem set you did not submit).
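The grading rule above (60% exam, 30% homework, 10% presentation, with the lowest of the six homework scores dropped) can be sketched as follows. This is only an illustration of the stated formula, not an official grading script; the function name and example scores are made up:

```python
def final_grade(exam, homework_scores, presentation):
    """Illustrative computation of the course grade:
    drop the lowest of the weekly homework scores, average the rest,
    then combine 60% exam + 30% homework + 10% presentation."""
    kept = sorted(homework_scores)[1:]        # lowest score is ignored
    homework = sum(kept) / len(kept)
    return 0.6 * exam + 0.3 * homework + 0.1 * presentation

# Example: a missed problem set counts as 0 but is dropped.
grade = final_grade(7.0, [6, 8, 9, 7, 0, 8], 8.0)
print(round(grade, 2))
```

Note that a single unsubmitted problem set (score 0) does not hurt the final grade, since it is the one that gets dropped.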

Instead of a midterm exam, we would like you to read about a topic in information theory and give a short presentation to your peers. You will present as a group of 3-4 students; everyone should speak for a few minutes. We will give you many suggestions for topics (on Canvas), but you are also free to pick your own topic (please confirm it with us first).

Everything discussed in class and on the homework is in principle in scope for the exam. We recommend that you revisit the homework problems and the exercises as preparation. You are allowed to bring one self-prepared "cheat sheet" to the exam (one hand-written A4 sheet; you may use both sides).