Classical and Quantum Information Theory

Information theory originated at the end of the Second World War with the work of Claude Shannon, an engineer working on problems of communication, and it has since informed major advances in digital technologies, from wireless connectivity to artificial intelligence. Information theory also plays a key role in the sciences, where it provides a useful framework for understanding and explaining physical phenomena, from the microscopic scale of quantum mechanics to the macroscopic scale of thermodynamics. The importance of information theory is only expected to grow as the language of information becomes increasingly relevant for engineers and scientists using machine learning and data analytics tools.

This page collects notes that aim to provide an intuitive, yet formal, introduction to information measures, covering both classical and quantum systems. The scope of the notes encompasses static information measures, namely the classical and quantum Shannon, Rényi, and Tsallis entropies, as well as their associated divergences; dynamic transformations of classical and quantum systems and their impact on information measures; classical correlations evaluated in terms of local and global uncertainties, as well as in terms of residual uncertainties; classical and quantum correlations in quantum systems, including entanglement and discord; and multipartite classical and quantum correlations studied through the lens of tensor networks.
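For orientation, the static measures mentioned above admit the following standard textbook definitions for a classical distribution p and a density matrix \rho (these are well-known formulas stated here as a quick reference, not excerpts from the notes):

% Standard definitions, stated for orientation only (not excerpted from the notes).
\begin{align}
  H(p) &= -\sum_x p(x)\log p(x)                          && \text{(Shannon entropy)} \\
  H_\alpha(p) &= \frac{1}{1-\alpha}\log\sum_x p(x)^\alpha && \text{(R\'enyi entropy, } \alpha>0,\ \alpha\neq 1\text{)} \\
  S_q(p) &= \frac{1}{q-1}\Big(1-\sum_x p(x)^q\Big)        && \text{(Tsallis entropy, } q\neq 1\text{)} \\
  S(\rho) &= -\mathrm{Tr}(\rho\log\rho)                   && \text{(von Neumann entropy)}
\end{align}

Both the Rényi and Tsallis entropies recover the Shannon entropy in the limit as their order parameter tends to 1, and the quantum counterparts are obtained by replacing sums over outcomes with traces of the density matrix.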

Drafts will be uploaded and updated as they become ready. Feedback is appreciated and can be sent to osvaldo.simeone_at_kcl.ac.uk.


This material supports Classical and Quantum Information Theory for Engineers and Scientists by Osvaldo Simeone, © 2025 Simeone [in preparation] 

Cite as: O. Simeone, "Classical and Quantum Information Theory for Engineers and Scientists," in preparation (https://sites.google.com/view/osvaldosimeone/cqit).