Information Theory

"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages" (Shannon, 1948).

Course Schedule and Reading

Bibliography

Suggested Readings

Abstract: A new method of estimating the entropy and redundancy of a language is described. This method exploits the knowledge of the language statistics possessed by those who speak the language, and depends on experimental results in prediction of the next letter when the preceding text is known. Results of experiments in prediction are given, and some properties of an ideal predictor are developed.
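As an illustrative aside (not part of the reading): the zeroth-order baseline that Shannon's prediction method sharpens, entropy estimated from single-letter frequencies alone, can be sketched in a few lines of Python:

```python
from collections import Counter
from math import log2

def unigram_entropy(text: str) -> float:
    """Zeroth-order entropy estimate in bits per letter,
    based only on single-letter frequencies."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    return -sum((k / n) * log2(k / n) for k in counts.values())

# A uniform 26-letter alphabet would give log2(26) ≈ 4.7 bits per letter;
# real English comes out lower because letters are not equiprobable.
print(unigram_entropy("the quick brown fox jumps over the lazy dog"))
```

The prediction experiments in the paper bound the true entropy of English far below any such unigram figure, because context makes letters much more predictable than their marginal frequencies suggest.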
Abstract: A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.
Abstract: The method of types is one of the key technical tools in Shannon Theory, and this tool is valuable also in other fields. In this paper, some key applications will be presented in sufficient detail, enabling an interested nonspecialist to gain a working knowledge of the method, and a wide selection of further applications will be surveyed. These range from hypothesis testing and large deviations theory through error exponents for discrete memoryless channels and capacity of arbitrarily varying channels to multiuser problems. While the method of types is suitable primarily for discrete memoryless models, its extensions to certain models with memory will also be discussed.
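To give a concrete feel for the method (this example is mine, not the paper's): the central counting fact is that the number of length-n sequences of a given type P is at most 2^{nH(P)}, and within a polynomial factor of that bound. For binary sequences this can be checked directly:

```python
from math import comb, log2

def H(p: float) -> float:
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

n, k = 100, 30
type_class_size = comb(n, k)  # sequences of length n with exactly k ones
# log2 of the type-class size is close to n*H(k/n), and never exceeds it.
print(log2(type_class_size), n * H(k / n))
```

This exponential count, with entropy in the exponent, is what drives the error-exponent and large-deviations applications the survey covers.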
Abstract: In this paper, the role of pattern matching in information theory is motivated and discussed. We describe the relationship between a pattern’s recurrence time and its probability under the data-generating stochastic source. We show how this relationship has led to great advances in universal data compression. We then describe nonasymptotic uniform bounds on the performance of data-compression algorithms in cases where the size of the training data that is available to the encoder is not large enough so as to yield the asymptotic compression: the Shannon entropy. We then discuss applications of pattern matching and universal compression to universal prediction, classification, and entropy estimation.
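The recurrence-time relationship alluded to here is Kac's lemma: for a stationary ergodic source, the expected time until a pattern recurs is the reciprocal of the pattern's probability. A quick numerical sanity check on fair i.i.d. bits (illustrative only; the pattern and sample size below are my own choices):

```python
import random

random.seed(0)
pattern = "101"
p = 0.5 ** len(pattern)  # probability of the pattern under fair i.i.d. bits

# Generate a long bit string and collect all (possibly overlapping)
# occurrence positions of the pattern.
bits = "".join(random.choice("01") for _ in range(200_000))
positions = []
i = bits.find(pattern)
while i != -1:
    positions.append(i)
    i = bits.find(pattern, i + 1)

# Gaps between consecutive occurrences should average about 1/p = 8.
gaps = [b - a for a, b in zip(positions, positions[1:])]
mean_gap = sum(gaps) / len(gaps)
print(mean_gap, 1 / p)
```

In universal compression this is what ties how far back one must search for a match (as in Lempel–Ziv parsing) to the log-probability, and hence the entropy, of the source.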
Abstract: In many communication situations, the transmitter and the receiver must be designed without a complete knowledge of the probability law governing the channel over which transmission takes place. Various models for such channels and their corresponding capacities are surveyed. Special emphasis is placed on the encoders and decoders which enable reliable communication over these channels.
Abstract: The history of the theory and practice of quantization dates to 1948, although similar ideas had appeared in the literature as long ago as 1898. The fundamental role of quantization in modulation and analog-to-digital conversion was first recognized during the early development of pulse-code modulation systems, especially in the 1948 paper of Oliver, Pierce, and Shannon. Also in 1948, Bennett published the first high-resolution analysis of quantization and an exact analysis of quantization noise for Gaussian processes, and Shannon published the beginnings of rate distortion theory, which would provide a theory for quantization as analog-to-digital conversion and as data compression. Beginning with these three papers of fifty years ago, we trace the history of quantization from its origins through this decade, and we survey the fundamentals of the theory and many of the popular and promising techniques for quantization.
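Bennett's high-resolution result mentioned in this abstract, that a uniform quantizer with small step Δ incurs mean-squared error of about Δ²/12, is easy to check numerically (a sketch under my own parameter choices, not code from the paper):

```python
import random

random.seed(1)
delta = 0.01  # quantizer step size

def quantize(x: float, step: float) -> float:
    """Round to the nearest multiple of the step (mid-tread uniform quantizer)."""
    return round(x / step) * step

# Uniform input on [-1, 1]: the quantization error is uniform on
# [-delta/2, delta/2], so its mean-squared value is delta**2 / 12.
samples = [random.uniform(-1.0, 1.0) for _ in range(100_000)]
mse = sum((x - quantize(x, delta)) ** 2 for x in samples) / len(samples)
print(mse, delta ** 2 / 12)
```

Each extra bit of resolution halves Δ and so cuts the MSE by a factor of four, the familiar 6 dB-per-bit rule for uniform quantizers.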

Scripts

>> tip for Windows 10 users: install Windows Subsystem for Linux.