The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correction, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and examines its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory.
Featuring a good supply of exercises (with solutions) and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics/computer science students at the beginning graduate level.
Antoine Chambert-Loir is a professor of mathematics at Université Paris Cité. His research addresses questions in algebraic geometry motivated by problems from number theory. He is the author of two books published by Springer-Verlag: A Field Guide to Algebra, an introduction to Galois theory; and (Mostly) Commutative Algebra, an intermediate-level exposition of commutative algebra. With J. Nicaise and J. Sebag, he co-wrote the research monograph Motivic Integration (published by Birkhäuser), which was awarded the 2017 Ferran Sunyer i Balaguer prize.