Selected Papers

An accuracy-based approach to quantum conditionalization

A core tenet of Bayesian epistemology is that rational agents update by conditionalization. Accuracy arguments in favour of this norm are well known. Meanwhile, scholars working in quantum probability and quantum state estimation have proposed multiple updating rules, all of which look prima facie like analogues of Bayesian conditionalization. The most common are Lüders conditionalization and Bayesian mean estimation (BME). Some authors also endorse a lesser-known alternative that we call retrodiction. We show how one can view Lüders and BME as complementary rules, and we give expected-accuracy and accuracy-dominance arguments for both. By contrast, we find that retrodiction is accuracy-dominated, at least on many measures of accuracy.
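
For orientation only (a standard textbook gloss, not part of the abstract): in the density-operator formalism, the Lüders rule updates a state ρ on learning the outcome associated with a projection P via

\[ \rho \;\mapsto\; \frac{P \rho P}{\operatorname{tr}(P \rho P)}, \]

and it reduces to ordinary Bayesian conditionalization in the special case where ρ and P commute.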

Hypothetical frequencies as approximations

Hájek (Erkenntnis 70(2):211–235, 2009) argues that probabilities cannot be the limits of relative frequencies in counterfactual infinite sequences. I argue for a different understanding of these limits, drawing on Norton’s (Philos Sci 79(2):207–232, 2012) distinction between approximations (inexact descriptions of a target) and idealizations (separate models that bear analogies to the target). Then, I adapt Hájek’s arguments to this new context. These arguments provide excellent reasons not to use hypothetical frequencies as idealizations, but no reason not to use them as approximations.
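
For reference, the view at issue identifies the probability of an attribute A with the limiting relative frequency of A in a counterfactual infinite sequence of trials:

\[ P(A) \;=\; \lim_{n \to \infty} \frac{n_A}{n}, \]

where n_A is the number of occurrences of A among the first n trials of that sequence. The question the paper presses is whether this limit should be read as an idealization or as an approximation.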

One world is (probably) just as good as many

One of our most sophisticated accounts of objective chance in quantum mechanics involves the Deutsch-Wallace theorem, which uses state-space symmetries to justify agents’ use of the Born rule when the quantum state is known. But Wallace argues that this theorem requires an Everettian approach to measurement. I find that this argument is unsound, and I demonstrate a counterexample by applying the Deutsch-Wallace theorem to the de Broglie-Bohm pilot-wave theory.
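
For context (a standard statement, not a claim of the paper): the Born rule the theorem is meant to underwrite assigns to outcome a_i of a measurement on a system in known state |ψ⟩ the chance

\[ \Pr(a_i) \;=\; \lvert \langle a_i \mid \psi \rangle \rvert^{2}, \]

where |a_i⟩ is the corresponding eigenstate; the paper's claim is that the symmetry-based route to this rule does not require an Everettian account of measurement.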

Is the classical limit "singular"? (with B. Feintzeig)

We argue against claims that the classical ℏ→0 limit is "singular" in a way that frustrates an eliminative reduction of classical to quantum physics. We show one precise sense in which quantum mechanics and scaling behavior can be used to recover classical mechanics exactly, without making prior reference to the classical theory. To do so, we use the tools of strict deformation quantization, which provides a rigorous way to capture the ℏ→0 limit. We then use the tools of category theory to demonstrate one way that this reduction is explanatory: it illustrates a sense in which the structure of quantum mechanics determines that of classical mechanics.
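
As a rough gloss (a standard condition from the strict deformation quantization literature, not a statement of the paper's results), a strict deformation quantization supplies maps Q_ℏ from classical observables to the quantum algebras such that, as ℏ→0,

\[ \left\| \tfrac{1}{i\hbar}\,\big[\,Q_\hbar(f),\, Q_\hbar(g)\,\big] \;-\; Q_\hbar\big(\{f,g\}\big) \right\| \;\longrightarrow\; 0, \]

so that the Poisson bracket {·,·} is recovered as the limit of the rescaled commutator.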

Extensions of bundles of C*-algebras (with B. Feintzeig)

Bundles of C*-algebras can be used to represent limits of physical theories whose algebraic structure depends on the value of a parameter. The primary example is the ℏ→0 limit of the C*-algebras of physical quantities in quantum theories, represented in the framework of strict deformation quantization. In this paper, we understand such limiting procedures in terms of the extension of a bundle of C*-algebras to some limiting value of a parameter. We prove existence and uniqueness results for such extensions. Moreover, we show that such extensions are functorial for the C*-product, dynamical automorphisms, and the Lie bracket (in the ℏ→0 case) on the fiber C*-algebras.
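
Schematically (an illustrative gloss rather than the paper's definitions), extending a bundle means passing from a family of fiber algebras defined away from the limit to one with a fiber at the limiting parameter value,

\[ \big(\mathfrak{A}_\hbar\big)_{\hbar \in (0,1]} \;\longrightarrow\; \big(\mathfrak{A}_\hbar\big)_{\hbar \in [0,1]}, \]

together with a compatible choice of continuous sections; in the ℏ→0 case the new fiber plays the role of the classical algebra of observables.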

Probabilism for stochastic theories

I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or “Dutch book”) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.
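
To illustrate the dominance notion with one standard choice of inaccuracy measure (the result itself is stated in the C*-algebraic framework, not this toy setup): given estimates b for finitely many quantities A_1, …, A_n and a possible chance assignment ω, let

\[ I(b, \omega) \;=\; \sum_{i=1}^{n} \big( b(A_i) - \omega(A_i) \big)^{2}. \]

Then b is accuracy-dominated if some alternative b′ satisfies I(b′, ω) < I(b, ω) for every possible ω; avoiding this is what the paper calls sufficient coherence.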

Two forms of inconsistency in quantum foundations (with N. Teh)

Recently, there has been some discussion of how Dutch Book arguments might be used to demonstrate the rational incoherence of certain hidden variable models of quantum theory (Feintzeig and Fletcher 2017). In this paper, we argue that the “form of inconsistency” underlying this alleged irrationality is deeply and comprehensively related to the more familiar “inconsistency” phenomenon of contextuality. Our main result is that the hierarchy of contextuality due to Abramsky and Brandenburger (2011) corresponds to a hierarchy of additivity/convexity violations that yields formal Dutch Books of different strengths. We then use this result to provide a partial assessment of whether these formal Dutch Books can be interpreted normatively.
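
As a toy illustration of how an additivity violation underwrites a formal Dutch Book (our example here, not one from the paper): suppose an agent assigns mutually exclusive events A and B the credences c(A) = c(B) = 0.5 but c(A ∨ B) = 0.8, and treats each credence as a fair price for a unit bet. A bookie who sells the agent unit bets on A and on B for 0.5 each and buys a unit bet on A ∨ B from the agent for 0.8 collects 0.2 up front, while in every possible outcome the later payoffs cancel exactly, so the agent suffers a sure loss of 0.2.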