Papers
* Email for draft
Counterfactual dependence and probabilistic dependence are two criteria frequently used to analyze causation. "Mere correlations" (instances of probabilistic dependence and counterfactual independence) are a well-studied class of cases where these criteria diverge. In this essay, I provide an example of the opposite type of divergence: counterfactual dependence and probabilistic independence. The butterfly effect of chaos theory says that had a butterfly in the distant past not flapped its wings, with everything else held fixed, it is possible (and indeed probable) that a present tornado would not have occurred. However, the mathematics of chaos also tells us that whether or not the butterfly flaps its wings, the probability of the tornado is the same. I show how these two claims fit together, highlighting the distinct and unorthodox counterfactual origin of probabilistic independence in chaotic systems. Examining the case under different theories of causation, I find widespread disagreement about whether the butterfly's flap causes the tornado. I argue that this disagreement can be explained by an underlying semantic indeterminacy in our ordinary conception of causation. Rather than being exceptional, we should expect these types of relationships, and thus indeterminacies, to predominate in chaotic systems over long timescales.
Blindspots of Empiricism in the Discovery of Chaos Theory
Studies in History and Philosophy of Science (2026)
Chaos theory is a branch of classical physics, discovered in the 1960s and 1970s, in which solutions depend sensitively on their initial conditions. For many, it is surprising that chaos theory was discovered so late. However, through the work of Henri Poincaré, we can see that much of the mathematics of chaos was understood some 70 years earlier. Furthermore, the writings of Poincaré's colleagues, Jacques Hadamard and Pierre Duhem, show a detailed understanding of the chaos found in his work. They also gave explicit reasons why the mathematics of chaos was to be ignored: a strict form of empiricism, positivism, led chaos to be labeled "useless" and "meaningless" mathematics because it was thought to be ungrounded in experience. In this paper, I describe how the empiricist tenets of positivism exiled chaos from physics following Poincaré.
Quantum Chaos, Measurement, and the Many Faces of Correspondence
Synthese (Forthcoming)
Classical chaos is frequently claimed to pose a problem for quantum-classical correspondence, and recent work on quantum decoherence purportedly solves this problem. This essay attempts to rationally reconstruct these claims. When comparing quantum and classical distributions, I argue that classical chaos poses no problem for quantum mechanics, even in the absence of quantum decoherence. By restricting our attention to the relevant physical details (actions, timescales, and the limited measurement resolution of classical observables) we find that quantum distributions do not appreciably diverge from their classical, chaotic counterparts over relevant timescales. This point has been obscured in the literature by inattention to realistic physical parameters and measurement capacities. The remaining problem chaos poses for quantum-classical correspondence is that chaos is a large natural channel for generating macroscopic quantum superpositions. Thus, the problem of quantum chaos is deeply tied up with the quantum measurement problem. The way this problem shows up, and the way decoherence "solves" it, is idiosyncratic to different interpretations of quantum mechanics. I illustrate this point using the Everett and Bohmian interpretations.
No Thermalization without Quantum Representation (with Kabir S. Bakshi)*
In the philosophy of statistical mechanics, two competing factions have emerged over how to understand the theory. Boltzmannian statistical mechanics attempts to ground macroscopic thermodynamic behavior in the behavior of microstates on an underlying phase space. Gibbsian statistical mechanics instead explains this behavior via probability densities over phase space. In this paper, we argue that while this division neatly fits the domain of classical statistical mechanics, it is ill equipped for the study of quantum statistical mechanics. We explain this in the context of the thermalization and fluctuation behavior of quantum systems that start far from equilibrium, as demonstrated in recent work on the Eigenstate Thermalization Hypothesis. Differences between the Boltzmannian and Gibbsian approaches to thermalization in classical statistical mechanics turn out to be merely semantic in quantum statistical mechanics. Additionally, pure- and mixed-state quantum systems feature both the Gibbsian notion of fluctuation (differences from the equilibrium value upon projective measurement) and the Boltzmannian notion of fluctuation (the dynamical fluctuation of individual systems away from equilibrium). Thus, this central distinction, and the surrounding debate, in the foundations of statistical mechanics is shown not to project into the quantum domain.