
Information physics: From energy to codes

P. Fraundorf

Abstract: We illustrate in terms familiar to modern-day science students that: (i) an uncertainty slope mechanism underlies the usefulness of temperature via its reciprocal, which is incidentally around 42 [nats/eV] at the freezing point of water; (ii) energy over kT and differential heat capacity are ``multiplicity exponents'', i.e. the bits of state information lost to the environment outside a system per 2-fold increase in energy and temperature respectively; (iii) even awaiting description of ``the dice'', gambling theory gives form to the laws of thermodynamics, availability minimization, and net surprisals for measuring finite distances from equilibrium, information content differences, and complexity; (iv) heat and information engine properties underlie the biological distinction between autotrophs and heterotrophs, and life's ongoing symbioses between steady-state excitations and replicable codes; and (v) mutual information resources (i.e. correlations between structures, e.g. a phenomenon and its explanation, or an organism and its niche) within and across six boundary types (ranging from the edges of molecules to the gap between cultures) are delocalized physical structures whose development is a big part of the natural history of invention. These tools might offer a physical framework to students of the code-based sciences when considering such disparate (and sometimes competing) issues as conservation of available work and the nurturing of genetic or memetic diversity.
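The figures quoted in points (i) and (ii) can be checked with a few lines of arithmetic. The sketch below (assuming the CODATA value of Boltzmann's constant in eV/K, and a monatomic ideal gas with multiplicity Omega proportional to E^(3N/2) as a stand-in for ``the dice'') verifies that reciprocal temperature at the ice point is about 42 nats/eV, and that the multiplicity exponent d(log2 Omega)/d(log2 E) equals E/kT:

```python
import math

# Boltzmann's constant in eV/K (CODATA recommended value)
k_B = 8.617333262e-5

# (i) reciprocal temperature at the freezing point of water,
# in nats of state uncertainty gained per eV of thermal energy
T_ice = 273.15                      # kelvin
inv_kT = 1.0 / (k_B * T_ice)        # ~42.5 nats/eV
inv_kT_bits = inv_kT / math.log(2)  # ~61 bits/eV

# (ii) for a monatomic ideal gas of N atoms, Omega is
# proportional to E**(3*N/2), so the multiplicity exponent
# d(log2 Omega)/d(log2 E) is 3*N/2, which equals E/kT
# because the thermal energy is E = (3/2) * N * k_B * T.
N = 10
exponent = 1.5 * N                          # d(log2 Omega)/d(log2 E)
E = 1.5 * N * k_B * T_ice                   # thermal energy in eV
E_over_kT = E / (k_B * T_ice)               # equals the exponent

print(inv_kT, inv_kT_bits, exponent, E_over_kT)
```

In other words, doubling the energy of this gas multiplies the number of accessible states by 2^(3N/2), so E/kT directly counts bits of multiplicity per 2-fold increase in energy.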
