At the quantum scale, the distinction between a physical system that contains information and the information it contains blurs. Just as information and computing are bound by the underlying physics, physical systems appear to obey constraints on the rates at which information can flow and be processed. We examine signs of computational bottlenecks in physical systems and models. Particularly interesting is the potential role of complexity in the emergence of classicality.
Entropy is the fundamental quantity of information theory and its extensions to computation, machine learning, thermodynamics, etc. Furthermore, some forms of entropy are stable in the infinite-dimensional limit, motivating their study in von Neumann and C* algebra settings. We study the mathematical properties of entropy, including fundamental inequalities, applications to information theory, and its growth or decay under noise and intervention.
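Two of the themes above, fundamental inequalities and entropy growth under noise, can be illustrated numerically in a few lines. The sketch below (our own illustration, not drawn from any particular paper) checks subadditivity, S(AB) ≤ S(A) + S(B), on a noisy Bell state, and shows the von Neumann entropy of a qubit rising under depolarizing noise; the function names and the choice of depolarizing channel are ours.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log rho], in nats; zero eigenvalues contribute 0."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def depolarize(rho, p):
    """Mix rho toward the maximally mixed state: (1 - p) rho + p I/d."""
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

# Entropy growth under noise: a pure qubit (S = 0) is driven toward
# the maximally mixed state (S = log 2) as the depolarizing rate p rises.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|
entropies = [von_neumann_entropy(depolarize(rho, p)) for p in (0.0, 0.5, 1.0)]

# Subadditivity, S(AB) <= S(A) + S(B), checked on a noisy Bell state.
bell = np.zeros((4, 4))
for i in (0, 3):
    for j in (0, 3):
        bell[i, j] = 0.5  # |phi+><phi+|, phi+ = (|00> + |11>)/sqrt(2)
rho_ab = depolarize(bell, 0.3)
# Partial traces over the second and first qubit, respectively.
rho_a = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=1, axis2=3)
rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)
s_ab, s_a, s_b = map(von_neumann_entropy, (rho_ab, rho_a, rho_b))
```

Here the marginals of the noisy Bell state stay maximally mixed while the joint entropy remains below their sum, a finite-dimensional instance of the inequalities whose infinite-dimensional and operator-algebraic analogues we study.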
A (the?) key challenge for quantum computing is to create non-classically complex processes at large scale. Unfortunately, today's quantum processors appear to incur increasing noise as they grow. When such quantum computations are scaled up, the stable regimes appear to be either classical or random. Theory suggests a third regime of large-scale, stable quantum dynamics, but realizing it has proven challenging. We seek hints in nature, including systems that appear to manifest high quantum complexity.
Despite the successes of physical modeling and machine learning, many of the world's most interesting systems still resist analytical or computational study. These systems typically include large numbers of strongly and non-linearly interacting components, yet available datasets are usually small and noisy. Nonetheless, there is strong intuition and evidence of underlying, deterministic, low-dimensional dynamics. We are particularly interested in how hidden commonalities between distinct systems may reveal more structure than would be apparent from the data alone.
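One standard way such hidden low-dimensional determinism shows up in practice is through delay-coordinate embedding: a minimal sketch, assuming (purely for illustration) that the logistic map plays the role of the hidden dynamics and that nearest-neighbor lookup in delay coordinates serves as the predictor. Both choices, and all names below, are ours.

```python
import numpy as np

# A noisy scalar series generated by hidden deterministic dynamics:
# the logistic map x_{t+1} = r x_t (1 - x_t), plus measurement noise.
rng = np.random.default_rng(0)
r, n = 3.9, 2000
x = np.empty(n)
x[0] = 0.4
for t in range(n - 1):
    x[t + 1] = r * x[t] * (1 - x[t])
noisy = x + rng.normal(0.0, 0.01, n)

# Delay-coordinate vectors (noisy[t], noisy[t+1]) expose the determinism:
# predict each next value by reusing the successor of the nearest past state.
m = 2  # embedding dimension
emb = np.stack([noisy[i:n - m + i] for i in range(m)], axis=1)
errors = []
for t in range(1000, 1990):
    d = np.linalg.norm(emb[:t - 1] - emb[t], axis=1)
    nn = int(np.argmin(d))
    errors.append(abs(noisy[nn + m] - noisy[t + m]))
mean_error = float(np.mean(errors))
# One-step prediction error sits well below the series' own variability,
# evidence of deterministic structure behind the noise.
```

The point of the sketch is the gap between the prediction error and the raw variability of the series: for a genuinely stochastic signal no such gap would appear, so the gap itself is a data-driven signature of low-dimensional dynamics.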