INF242: Information Theory

Entropy

A nice article on the notion of 'surprise' in relation to 'information content'.

An anecdote on the origins of information-theoretic 'entropy', recounting a conversation between Shannon and von Neumann.

Chapter 1 of these notes serves as a good resource for anyone interested in the statistical mechanics version of 'entropy'.

Wiki entry on the relation between information-theoretic and thermodynamic entropy. This book on the topic also looks interesting.
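To make the 'surprise' idea from the articles above concrete: the information content (self-information) of an outcome with probability p is -log2(p) bits, and Shannon entropy is the expected surprise over a distribution. A minimal sketch (the function names are my own):

```python
import math

def surprise(p: float) -> float:
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Shannon entropy: expected surprise over a distribution, in bits."""
    return sum(p * surprise(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because its outcomes are on average less surprising.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))
```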

Podcasts and Videos

"My Favourite Theorem" |  Episodes 41 and 47 are dedicated to Fano's theorem and channel coding !

"The Bit Player" (2018)     |  A film about Claude Shannon. Here's a trailer.

Other

Quanta article on Markov chains, randomness and '15 puzzles'.

Markov, Shannon and language-generation models.
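Shannon famously generated text by sampling letters conditioned on the preceding few characters, a precursor of today's language models. A minimal character-level sketch in that spirit (function names and the toy corpus are my own):

```python
import random
from collections import defaultdict

def build_model(text: str, order: int = 2) -> dict:
    """Map each length-`order` context to the characters that follow it in the text."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model: dict, seed: str, length: int = 100) -> str:
    """Extend `seed` by repeatedly sampling a next character given the last `len(seed)` characters."""
    order = len(seed)
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:
            break  # context never seen in the training text
        out += random.choice(followers)
    return out

corpus = "the theory of information is the theory of surprise"
model = build_model(corpus, order=2)
print(generate(model, "th", length=40))
```

Higher orders mimic the source text more closely at the cost of less variety, which is exactly the trade-off Shannon illustrated with his letter- and word-level approximations to English.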