An Introduction to Single-User Information Theory

The reliable transmission and processing of information-bearing signals over a noisy communication channel is at the heart of what we call communication. Information theory, founded by Claude E. Shannon in 1948 [340], provides a mathematical framework for the theory of communication. It describes the fundamental limits on how efficiently one can encode information and still be able to recover it at the destination, either with negligible loss or within a prescribed distortion threshold.

The purpose of this textbook is to present a concise, yet mathematically rigorous, introduction to the main pillars of information theory. It thus naturally focuses on the foundational concepts and indispensable results of the subject for single-user systems, where a single data source or message needs to be reliably processed and communicated over a noiseless or noisy point-to-point channel. The book consists of five meticulously drafted core chapters (with accompanying problems), emphasizing the key topics of information measures, lossless and lossy data compression, channel coding, and joint source–channel coding. Two appendices covering necessary and supplementary material in real analysis and in probability and stochastic processes are included, and a comprehensive instructor's solutions manual is available separately. The book is well suited for a single-term first course on information theory, spanning 10 to 15 weeks, offered to senior undergraduate and entry-level graduate students in mathematics, statistics, engineering, and computing and information sciences.

The textbook grew out of lecture notes we developed while teaching information theory over the last 20 years to students in applied mathematics, statistics, and engineering at our home universities. Over the years of teaching the subject, we realized that standard textbooks, some of them undeniably outstanding, tend to cover a large amount of material (including advanced topics), which can be overwhelming and inaccessible to novice students. They also do not always provide all the necessary technical details in the proofs of the main results. We hope that this book addresses these issues by being succinct and mathematically precise, and that it helps beginners acquire a deep understanding of the fundamental elements of the subject.

The textbook aims to provide a coherent introduction to the primary principles of single-user information theory. All the main Shannon coding theorems are proved in full detail (without skipping important steps) using a consistent approach based on the law of large numbers, or equivalently, the asymptotic equipartition property (AEP).
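For orientation, the AEP in its simplest (i.i.d.) form states that if $X_1, X_2, \ldots$ are drawn independently from a discrete distribution $p$ with entropy $H(X)$, then

$$ -\frac{1}{n} \log p(X_1, \ldots, X_n) \;\to\; H(X) \quad \text{in probability as } n \to \infty, $$

so that, with probability approaching 1, long source sequences are "typical" in the sense of having probability close to $2^{-nH(X)}$ (when logarithms are taken to base 2). The text develops this property in detail, together with its extensions to stationary ergodic sources.

A brief description of the topics of each chapter follows.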

  • Chapter 2: Information measures for discrete systems and their properties: self-information, entropy, mutual information and divergence, data processing theorem, Fano’s inequality, Pinsker’s inequality, simple hypothesis testing, the Neyman–Pearson lemma, the Chernoff–Stein lemma, and Rényi’s information measures.
  • Chapter 3: Fundamentals of lossless source coding (data compression): discrete memoryless sources, fixed-length (block) codes for asymptotically lossless compression, AEP (numerically illustrated in the sketch following this list), fixed-length source coding theorems for memoryless and stationary ergodic sources, entropy rate and redundancy, variable-length codes for lossless compression, variable-length source coding theorems for memoryless and stationary sources, prefix codes, Kraft inequality, Huffman codes, Shannon–Fano–Elias codes, and Lempel–Ziv codes.
  • Chapter 4: Fundamentals of channel coding: discrete memoryless channels, block codes for data transmission, channel capacity, coding theorem for discrete memoryless channels, example of polar codes, calculation of channel capacity, channels with symmetric structures, lossless joint source–channel coding, and Shannon’s separation principle.
  • Chapter 5: Information measures for continuous alphabet systems and Gaussian channels: differential entropy, mutual information and divergence, AEP for continuous memoryless sources, capacity and channel coding theorem of discrete-time memoryless Gaussian channels, capacity of uncorrelated parallel Gaussian channels and the water-filling principle, capacity of correlated Gaussian channels, non-Gaussian discrete-time memoryless channels, and capacity of band-limited (continuous-time) white Gaussian channels.
  • Chapter 6: Fundamentals of lossy source coding and joint source–channel coding: distortion measures, rate–distortion theorem for memoryless sources, rate–distortion theorem for stationary ergodic sources, rate–distortion function and its properties, rate–distortion function for memoryless Gaussian and Laplacian sources, lossy joint source–channel coding theorem, and Shannon limit of communication systems.
  • Appendix A: Overview of suprema and limits.
  • Appendix B: Overview of probability and random processes: random variables and stochastic processes, statistical properties of random processes, Markov chains, convergence of sequences of random variables, ergodicity and laws of large numbers, the central limit theorem, concavity and convexity, Jensen's inequality, Lagrange multipliers, and the Karush–Kuhn–Tucker (KKT) conditions for constrained optimization problems.
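As a small numerical taste of the entropy and AEP topics above, the following minimal Python sketch (with purely illustrative parameter choices) estimates, by Monte Carlo simulation, the probability that an i.i.d. Bernoulli source sequence is typical, i.e., that its normalized log-probability falls within a small tolerance of the source entropy:

    import math
    import random

    def binary_entropy(p):
        """Entropy H(p) in bits of a Bernoulli(p) source."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def normalized_log_prob(p, n, rng):
        """Return -(1/n) log2 p(X_1, ..., X_n) for one i.i.d. Bernoulli(p) sequence."""
        log_prob = sum(math.log2(p if rng.random() < p else 1 - p) for _ in range(n))
        return -log_prob / n

    rng = random.Random(2018)   # fixed seed for reproducibility
    p, eps, trials = 0.3, 0.05, 2000
    print(f"H({p}) = {binary_entropy(p):.4f} bits")   # about 0.8813
    # By the AEP, the probability that a sequence is typical (its normalized
    # log-probability lies within eps of the entropy) tends to 1 as n grows.
    for n in (10, 100, 1000):
        hits = sum(abs(normalized_log_prob(p, n, rng) - binary_entropy(p)) < eps
                   for _ in range(trials))
        print(f"n = {n:4d}: estimated P(typical) = {hits / trials:.3f}")

Running the sketch, the estimated typicality probability should be seen to rise toward 1 as the blocklength n increases, in line with the convergence stated earlier.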
