Workshop: Constructing Quantum Theories

Dates: Thurs., April 29 – Sun., May 2, 2021

Location: Zoom (email bfeintze@uw.edu for access)

Organizers: Ben Feintzeig, Jeremy Steeger, Kade Cicchella

Support: We gratefully acknowledge support from the National Science Foundation, the UW Philosophy Department, the Saari Endowment, and the Robinson HPS Initiative.

Speakers

  • Klaas Landsman (Radboud University Nijmegen, Mathematical Physics)

  • Pierre Bieliavsky (Louvain, Mathematics)

  • Michel Janssen (Minnesota, History of Science)

  • Maria Papageorgiou (Waterloo, Physics)

  • Kasia Rejzner (York, Mathematics)

  • Michael Miller (Toronto, Philosophy)

  • Alex Blum (Berlin, Max Planck Institute)

  • Adam Koberinski (Western, Philosophy)

  • Stephen Sharpe (Washington, Physics)

  • Stefan Waldmann (Würzburg, Mathematics)

  • Jeremy Steeger (Washington, Philosophy)

  • Doreen Fraser (Waterloo, Philosophy)

Discussions led by:

  • Laura Ruetsche (Michigan, Philosophy)

  • Ben Feintzeig (Washington, Philosophy)

Schedule

[Schedule image: Schedule for Constructing Quantum Theories Workshop]

Titles & Abstracts

  • Klaas Landsman (Radboud University Nijmegen, Mathematical Physics) [slides, recording]

Title: Quantization: The Big Picture

Abstract: The Big Picture arose in the 1990s from my efforts to relate Mackey's quantization theory based on systems of imprimitivity (which Mackey himself saw as the natural implementation of what he called Weyl's Program, i.e. the construction of the basic operators of quantum mechanics from group-theoretical considerations) to deformation quantization, and hence to the tradition started by Dirac and continued by Groenewold, Moyal, Berezin, Flato, Rieffel, and others. The Big Picture is technically based on the theory of Lie groupoids and Lie algebroids. Both traditions were originally powerful but almost disjoint, and their fusion provides a complete and satisfying view of the idea of quantization, or “quantum-mechanical reinterpretation,” that Heisenberg started in 1925.

Literature: KL, Foundations of Quantum Theory (Springer Open Access, 2017), http://www.springer.com/gp/book/9783319517766, especially Chapter 7.
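
As a schematic gloss on the deformation-quantization tradition invoked above (standard background, not drawn from the talk itself): a star product deforms the pointwise product of classical observables in powers of Planck's constant,

    f \star_\hbar g = fg + \frac{i\hbar}{2}\{f,g\} + O(\hbar^2),

so that the rescaled commutator (1/i\hbar)[f,g]_\star recovers the Poisson bracket \{f,g\} as \hbar \to 0; the Lie groupoid machinery provides the operator-algebraic setting in which such limits become rigorous.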


  • Pierre Bieliavsky (Louvain, Mathematics) [recording]

Title: Deformation quantization, differential geometry and Lie theory

Abstract: In this talk, I will survey some of my work in deformation quantization/noncommutative geometry. For several years now, I have mostly been studying the implications of Lie theory for noncommutative geometry within a symplectic/Poisson context: Lie groups, homogeneous spaces, symmetric spaces, noncommutative harmonic analysis. For this, I use a geometric approach to non-formal deformation quantization. I am interested in how homogeneous symplectic geometry guides quantum mechanics, and more generally quantization, in various symmetric situations: noncommutative homogeneous spaces (à la A. Connes), non-formal Drinfel'd twists, locally compact quantum groups. In my work I certainly use formal deformation quantization techniques (star products), but my goals lie firmly in operator algebras (C*, Fréchet). I will explain these notions and present examples.


  • Michel Janssen (Minnesota, History of Science)

Title: From discreteness to non-commutativity: quantum conditions, 1900–1927

Abstract: I give a brief overview of the development of quantum conditions from Planck to Heisenberg. In his work on black-body radiation, Planck introduced his famous constant and sowed the seeds of the phase-integral approach Sommerfeld later used to generalize Bohr’s model of the hydrogen atom. Schwarzschild connected this approach to some powerful techniques from celestial mechanics. Within a decade, however, the Bohr-Sommerfeld theory ran into insurmountable difficulties. In his famous Umdeutung [= reinterpretation] paper, Heisenberg showed that the way out of the impasse was to replace single-component quantities occurring in the classical laws by many-component quantities, which Born and Jordan soon recognized to be matrices. Following this Umdeutung procedure, Heisenberg, Born and Jordan rewrote the basic quantum condition of the old quantum theory in the form of the now familiar commutation relations for position and momentum.
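
For reference, the formulas behind this narrative (standard background, not quoted from the abstract): the Bohr-Sommerfeld phase-integral condition and the commutation relation that replaced it under Umdeutung are

    \oint p \, dq = nh \quad\longrightarrow\quad \hat{q}\hat{p} - \hat{p}\hat{q} = i\hbar\,\mathbb{1},

where the right-hand side is the Born-Jordan form of the quantum condition, now universally written as [q, p] = i\hbar.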


  • Maria Papageorgiou (Waterloo, Physics)

Title: Detector models for QFT: Superluminal signaling and ‘impossible measurements’

Abstract: Non-relativistic quantum mechanics has a standard ideal measurement theory. While there is no consensus about how to interpret this measurement theory, the recipe for extracting probabilistic predictions for measurement outcomes from the theory is uncontroversial. In contrast, articulating the ideal measurement theory for relativistic quantum field theory (QFT) is far from straightforward. Sorkin's 1993 paper “Impossible measurements on quantum fields” aims to show that the natural generalization of the non-relativistic measurement scheme to relativistic quantum theory fails because it entails superluminal signaling. In this talk, I will first clarify the logical structure of Sorkin's no-go result and analyze the various responses. From this perspective, I will motivate the construction of detector models, that is, the introduction of ‘local’ probe systems that are thought of as implementing local measurements on the quantum field. This can be consistently achieved in the context of algebraic QFT by modeling the probe using relativistic QFT, as in the framework recently proposed by Fewster and Verch. Another type of detector model is the Unruh-DeWitt model that is commonly used in the field of Relativistic Quantum Information. In this case, due to the non-relativistic nature of the detector system, the adherence of the model to the relativistic premises of the underlying QFT is not guaranteed. I will analyze two kinds of acausal behavior of Unruh-DeWitt-type detector models. I will show in what way spatially extended non-relativistic detector models predict superluminal propagation of the field's initial conditions. Finally, I will connect this characteristic of non-relativistic detector models to Sorkin's no-go result. This work has been done in collaboration with Prof. Doreen Fraser, Prof. Eduardo Martin-Martinez and Jose de Ramon Rivera.
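
As a sketch of the detector models at issue (the standard Unruh-DeWitt form from the Relativistic Quantum Information literature, supplied here for concreteness rather than taken from the talk), the interaction Hamiltonian couples a two-level system to the field through switching and smearing functions:

    H_{int}(t) = \lambda\,\chi(t)\,\hat{\mu}(t) \otimes \int d^3x\, F(\mathbf{x})\,\hat{\phi}(t,\mathbf{x}),

where \lambda is the coupling, \chi the switching function, \hat{\mu} the detector's monopole moment, and F the spatial profile; it is the non-relativistic detector dynamics, together with the extended smearing F, that makes compatibility with the relativistic premises of the underlying QFT non-trivial.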


  • Kasia Rejzner (York, Mathematics)

Title: Locality and non-locality in quantum field theory and beyond

Abstract: In this talk I want to discuss various notions of locality present in quantum field theory. I will also give examples of situations where non-local effects are present. Next, I will say how things change if we wish to go beyond quantum field theory and start talking about quantum gravity.


  • Michael Miller (Toronto, Philosophy)

Title: Whence Distributions?

Abstract: The physical notion of a local quantum field can be expressed naturally in terms of operator valued distributions. However, a systematic theory of such mathematical objects was not available until the development of the framework of quantum field theory was already well underway. In this talk I will consider the influence this contingent historical fact had on the course of the development of quantum field theory. I will trace the origins of the mathematical theory of distributions and its subsequent integration into quantum field theory both in Wightman's axiomatic framework, and in what eventually became the causal perturbation theory program. I will conclude by considering the implications of this history for contemporary interpretive debates.
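
For orientation (a textbook formula, not part of the abstract): the sense in which a quantum field is an operator-valued distribution is that only the smeared object

    \phi(f) = \int \phi(x)\, f(x)\, d^4x, \qquad f \in \mathcal{S}(\mathbb{R}^4),

defines an operator on Hilbert space, while \phi(x) at a sharp spacetime point does not; this is the mathematical fact whose delayed availability the talk traces.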


  • Alex Blum (Berlin, Max Planck Institute)

Title: Deconstructing QFT

Abstract: In the late 1940s, quantum electrodynamics became an empirically successful physical theory through the development of covariant renormalization methods and the calculation of the Lamb shift and the anomalous magnetic moment of the electron. In my talk, I discuss the immediate aftermath of this breakthrough, looking at the attempts to build on this success and turn quantum field theory (QFT) into a mathematically consistent, universal framework for microscopic physics. I describe the failure of these attempts (in particular by Freeman Dyson and the Cambridge School of Nicholas Kemmer) and the subsequent discovery of potential inconsistencies in the QFT framework, such as Gunnar Källén's proof that the UV divergences of QED are non-perturbative in nature, and the Landau pole. I discuss the reactions to these developments and argue that we can observe here a greater variety of attitudes toward the mathematical consistency of QFT than the usual axiomatic-phenomenological dichotomy suggests.


  • Adam Koberinski (Western, Philosophy)

Title: Constructing Effective Field Theories in Particle Physics

Abstract: The modern understanding of quantum theories on a continuum is that they are effective field theories (EFTs), useful for a limited range of energy or distance scales. This framework is used primarily in particle physics and condensed matter physics, though the insights have extended beyond this, to general relativity and classical field theories as well. Though the framework may be widely applicable, there are important differences between EFTs in each context. In this talk I will discuss the construction and use of EFTs in particle physics. Though the EFT framework is more broadly applicable, the often discussed top-down EFT construction is better suited to the context of condensed matter physics. In order to understand the full range of applications in particle physics, it is better to take a bottom-up perspective. This has implications for the way one thinks of the cutoff, so-called “bare” parameters, and coarse graining in particle physics.
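
Schematically, the bottom-up construction discussed here takes the form (standard EFT notation, added for concreteness rather than drawn from the talk):

    \mathcal{L}_{EFT} = \mathcal{L}_{d \le 4} + \sum_i \frac{c_i}{\Lambda^{d_i - 4}}\,\mathcal{O}_i,

where the sum runs over all local operators \mathcal{O}_i compatible with the assumed symmetries, the Wilson coefficients c_i are fixed by experiment rather than by matching to a known high-energy theory, and the scale \Lambda marks where the expansion breaks down rather than a physical lattice cutoff.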


  • Stephen Sharpe (Washington, Physics)

Title: View from the front line: simulations of quantum chromodynamics and the continuum limit

Abstract: Quantum chromodynamics (QCD) is the quantum field theory (QFT) that purportedly describes the strong interactions. The strongest evidence for this broadly accepted claim now comes from numerical simulations of QCD, discretized on a space-time lattice (with time being imaginary), and quantized using the Feynman path integral. A crucial question for the existence of this QFT is whether the continuum limit exists, i.e. whether one can extrapolate to vanishing lattice spacing. After a brief recap of QCD and the strong interactions, I outline the theoretical arguments that support the existence of the continuum limit, which are based on the property of asymptotic freedom observed in QCD when analyzed in perturbation theory, and show the status of numerical tests of this limit. I also mention another important limit that must be controlled, namely the limit of infinite volume, which is required since simulations are necessarily done in finite volume. I close with some comments on the status of the continuum limit in generic QFTs.
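
A minimal sketch of the extrapolation step described above, under entirely hypothetical inputs (the lattice spacings, central values, and errors below are invented for illustration, not actual simulation results): for an O(a)-improved action the leading discretization error scales as a^2, so one fits O(a) = O_cont + c a^2 by weighted least squares and reads off the a → 0 intercept.

    import numpy as np

    # Hypothetical data: an observable measured at four lattice spacings.
    # All numbers are illustrative only.
    a = np.array([0.12, 0.09, 0.06, 0.045])       # lattice spacing in fm
    obs = np.array([1.315, 1.292, 1.278, 1.273])  # measured central values
    err = np.array([0.004, 0.003, 0.003, 0.002])  # statistical errors

    # Leading cutoff effects for an improved action scale like a^2,
    # so model obs(a) = O_cont + c * a^2 and fit by weighted least squares.
    X = np.column_stack([np.ones_like(a), a**2])
    W = np.diag(1.0 / err**2)
    cov = np.linalg.inv(X.T @ W @ X)   # parameter covariance matrix
    O_cont, c = cov @ (X.T @ W @ obs)  # [intercept, a^2 slope]

    print(f"continuum limit: {O_cont:.4f} +/- {np.sqrt(cov[0, 0]):.4f}")
    print(f"a^2 slope:       {c:.3f}")

In practice one also varies the fit ansatz (adding higher powers of a, or logarithms) to assess the systematic error of the extrapolation, and repeats the exercise at several physical volumes to control the infinite-volume limit mentioned above.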


  • Stefan Waldmann (Würzburg, Mathematics)

Title: Convergence and Continuity of Star Products

Abstract: In this overview I will discuss the need for convergence in formal deformation quantization. Several different approaches exist in the literature, each with its own benefits and deficits. In recent years some new developments have taken place: the formal series are required to converge in suitable locally convex topologies, leading to continuity of the star product. While not a general scheme, many classes of examples can be put into this picture. I will also comment on further properties of the algebras obtained this way, in particular their *-representations and questions of self-adjointness.
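
To fix notation (a standard definition, supplied as background rather than quoted from the abstract): a formal star product is a power series in \hbar,

    f \star g = \sum_{r=0}^{\infty} \hbar^r C_r(f,g), \qquad C_0(f,g) = fg, \quad C_1(f,g) - C_1(g,f) = i\{f,g\},

and the convergence question is whether, on a suitable subalgebra of observables and for actual values of \hbar, this series converges in a locally convex topology in which \star is continuous.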


  • Jeremy Steeger (Washington, Philosophy)

Title: Is the classical limit 'singular'?

Abstract: Within the foundations of physics, there is a tradition of arguing that the transition from quantum to classical mechanics involves singular limits—and that these singular limits scuttle the reduction of the former theory to the latter (Berry, 1994; Batterman, 2002; Bokulich, 2008). This argument challenges the notion, popularized by Nickles (1973), that the mathematical structure of a successor theory reduces to that of its predecessor in a given domain by taking the limit of some parameter. Using the tools of strict deformation quantization, we argue contra Berry, Batterman, and Bokulich that the classical limit is an exemplar of reduction in roughly Nickles’s sense. We argue that deformation quantization provides a clear and useful sense in which the classical limit is both smooth and instructive. The quantization procedure specifies a bundle of different algebras of observables, some more quantum than others, and a limiting process by which the quantum algebras reduce to a classical algebra. This limiting process is smooth: the quantum algebras converge uniformly to the classical one. It is, moreover, instructive: from the quantum algebras alone, we can completely and uniquely recover the classical algebra. We establish each of these latter points using new technical results within the deformation-quantization framework.
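
In outline (a standard formulation of strict deformation quantization, added as background rather than quoted from the talk): the bundle of algebras comes with quantization maps Q_\hbar satisfying, for classical observables f and g,

    \lim_{\hbar \to 0} \left\| \tfrac{1}{i\hbar}[Q_\hbar(f), Q_\hbar(g)] - Q_\hbar(\{f,g\}) \right\| = 0, \qquad \lim_{\hbar \to 0} \|Q_\hbar(f)\| = \|f\|_\infty,

the uniform convergence that underwrites the 'smooth' and 'instructive' claims above.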


  • Doreen Fraser (Waterloo, Philosophy)

Title: The use of formal analogies in model construction

Abstract: One of the advantages of a formal mathematical presentation of a theory is that similarities in mathematical structure to other theories can be revealed. This allows mathematical problem solving strategies developed for use in one theory to be exported to another theory. An example of this heuristic in action was the application of Euclidean functional integral techniques in constructive QFT in the 1970s. Very difficult problems constructing models for relativistic QFT were translated into more tractable problems involving Euclidean fields, including models of classical statistical mechanics. At about the same time, Wilson’s formulation of renormalization group methods for QFT was also inspired by analytic continuation to classical statistical mechanics and analogies to critical phenomena. I will offer some reflections on differences between the methods used in these two model construction projects, including the role played by different types of formal analogies and the use of scaling transformations to secure either mathematical or physical equivalence.
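
As a one-line gloss on the analytic continuation common to both projects (standard Wick rotation, not part of the abstract): substituting t = -i\tau turns the oscillatory Feynman weight into a Boltzmann-like one,

    e^{iS[\phi]/\hbar} \;\longrightarrow\; e^{-S_E[\phi]/\hbar},

so that correlation functions of a relativistic QFT become those of a classical statistical system in four Euclidean dimensions, which is what allowed techniques to migrate in both directions.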