Papers
* email for draft
Counterfactual dependence and probabilistic dependence are two criteria frequently used to analyze causation. "Mere correlations" - instances of probabilistic dependence and counterfactual independence - are a well-studied class of cases where these criteria diverge. In this essay, I provide an example of the opposite type of divergence: counterfactual dependence and probabilistic independence. The butterfly effect of chaos theory says that had a butterfly in the distant past not flapped its wings, with everything else held identical, it is possible (and indeed probable) that a present tornado would not have occurred. However, the math of chaos also tells us that whether or not the butterfly flaps its wings, the probability of the tornado is the same. I show how these two claims fit together, highlighting the distinct and unorthodox counterfactual origin of probabilistic independence in chaotic systems. Examining the case under different theories of causation, I find widespread disagreement about whether the butterfly's flap causes the tornado. I argue that this disagreement can be explained by an underlying semantic indeterminacy in our ordinary conception of causation. Rather than being exceptional, we should expect these types of relationships, and thus these indeterminacies, to predominate in chaotic systems over long timescales.
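Not from the essay, but to make the two claims concrete, here is a minimal numerical sketch. It uses the Lorenz system as a stand-in for a chaotic atmosphere; the equations, parameters, perturbation size, and the "tornado" threshold (z > 35) are illustrative assumptions only.

```python
# Illustrative sketch: a tiny perturbation ("butterfly flap") to the initial
# condition makes the later fine-grained trajectory counterfactually different,
# yet the long-run frequency of a coarse event ("tornado") is essentially
# unchanged.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_span = (0.0, 200.0)
t_eval = np.linspace(*t_span, 100_000)
s0 = np.array([1.0, 1.0, 1.0])
s0_flap = s0 + np.array([1e-10, 0.0, 0.0])  # the butterfly's flap

sol_a = solve_ivp(lorenz, t_span, s0, t_eval=t_eval, rtol=1e-9, atol=1e-12)
sol_b = solve_ivp(lorenz, t_span, s0_flap, t_eval=t_eval, rtol=1e-9, atol=1e-12)

# Counterfactual dependence: the pointwise separation grows to the size of
# the attractor.
print("max pointwise separation:", np.linalg.norm(sol_a.y - sol_b.y, axis=0).max())

# Probabilistic independence: the long-run frequency of the coarse "tornado"
# event is nearly the same with and without the flap.
print("P(tornado | no flap):", np.mean(sol_a.y[2] > 35.0))
print("P(tornado | flap):   ", np.mean(sol_b.y[2] > 35.0))
```

The first number is of the order of the attractor's diameter (the fine-grained state depends counterfactually on the flap), while the two frequencies come out nearly identical (the coarse event is probabilistically independent of it).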
Chaos theory is a branch of classical physics, discovered in the 1960s-70s, in which solutions depend sensitively on their initial conditions. For many, it is surprising that chaos theory was discovered so late. However, through the work of Henri Poincaré, we can see that much of the math of chaos was understood some 70 years prior. Furthermore, through the writings of Poincaré’s colleagues — Jacques Hadamard and Pierre Duhem — we also see a detailed understanding of the chaos found in his work. They also give explicit reasons why the math of chaos was to be ignored. A strict form of empiricism — positivism — caused chaos to be labeled “useless” and “meaningless” mathematics because it was thought to be ungrounded in experience. In this paper, I describe how the empiricist tenets of positivism exiled chaos from physics following Poincaré.
Classical chaos is frequently claimed to pose a problem for quantum-classical correspondence. Recent work on quantum decoherence purportedly solves this problem. This essay attempts to reconstruct these claims. Comparing quantum and classical distributions, I argue that classical chaos poses no problem for quantum mechanics, even in the absence of quantum decoherence. By restricting our attention to relevant physical details - actions, timescales, and the limited measurement resolution of classical observables - we find that quantum distributions do not appreciably diverge from their classical, chaotic counterparts over relevant timescales, even without decoherence. This point has been obscured in the literature by an overemphasis on mathematical limits - namely the ℏ → 0 and t → ∞ limits - rather than realistic ranges of physical parameters. The real problem posed by chaos for quantum-classical correspondence is that chaos is a large natural channel for generating macroscopic quantum superpositions. Thus, the problem of quantum chaos is intimately tied up with the quantum measurement problem. The way this problem shows up, and the way decoherence "solves" it, is idiosyncratic to different interpretations of quantum mechanics. I illustrate this point using the Everett and Bohmian interpretations.
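As background on the limits at issue (a standard estimate from the quantum-chaos literature, given here for illustration and not a result of this essay): for a classically chaotic system with Lyapunov exponent λ and characteristic action S, quantum expectation values are commonly estimated to track their classical counterparts only up to a break (Ehrenfest) time of roughly

```latex
\[
  t_{\hbar} \;\sim\; \frac{1}{\lambda}\,\ln\!\left(\frac{S}{\hbar}\right).
\]
```

On the essay's framing, the interesting question is not the order in which the ℏ → 0 and t → ∞ limits are taken, but how large this timescale, the ratio S/ℏ, and the resolution of classical measurements actually are for realistic systems.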
Causation as a Cognitively Anchored Concept*
Here are three ideas familiar to those who study causation:
(1) Our folk concept of causation becomes indeterminate in many exotic examples, particularly in cases far outside the domains we are used to.
(2) Causation plays a fundamental role in cognition.
(3) Causation resists precise conceptual analysis or metaphysical reduction.
Of these, only (1) is contentious. In this essay, I argue for (1) and show how the conjunction of (1) and (2) explains (3). Nature has not given us a concept of causation with precise boundaries or determinate content. This is quite commonplace among our concepts, and we ordinarily have a way of resolving such indeterminacies through specification, stipulation, and linguistic negotiation. However, causation is anchored to fundamental tracking mechanisms in cognition. This makes it hard to redefine or clarify as we learn more about the world. It is this combination of properties, being both ill-defined and resistant to redefinition, that thwarts attempts to give a univocal criterion of causation.
The Long Arm of the Laws: A Case Against Revisionist Counterfactual History*
David Lewis' account of counterfactuals says that the closest possible worlds will exactly match the actual world's history up until a small, localized miracle occurs which allows the antecedent to be true. Recent accounts by Barry Loewer, David Albert, and Cian Dorr suggest that the closest worlds are ones that only approximately match the actual world's history, but keep the laws intact. I provide a counterexample to these recent accounts: assuming the world is relativistic, they allow counterfactual dependence between space-like separated events, which should not be able to influence one another at all. They allow this dependence via backwards-to-forwards influence: changing a present event deterministically changes the initial state of the universe, which in turn implies changes to what happens in spatially distant regions in the present.
No Thermalization without Quantum Representation (with Kabir Bakshi)
In the philosophy of statistical mechanics, two competing factions have emerged over how to understand the theory. Boltzmannian statistical mechanics attempts to ground macroscopic thermodynamic behavior in the behavior of the microstates of an underlying phase space. Gibbsian statistical mechanics instead explains this behavior via probability densities over phase space. In this paper, we argue that while this division neatly fits the domain of classical statistical mechanics, it is ill suited to the study of quantum statistical mechanics. We explain this in the context of the thermalization and fluctuation behavior of quantum systems that start far from equilibrium. Ontological differences between the Boltzmannian and Gibbsian approaches to thermalization in classical statistical mechanics turn out to be merely semantic differences in quantum statistical mechanics. Additionally, pure- and mixed-state quantum statistical mechanical systems feature both the Gibbsian notion of fluctuation (differences from the equilibrium value upon projective measurement) and the Boltzmannian notion of fluctuation (the dynamical fluctuation of individual systems away from equilibrium). Thus, because it ontologically reifies an outdated classical picture of reality, this central debate in the foundations of statistical mechanics is shown to be non-projectable into the more fundamental quantum realm.
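As a hedged illustration (the notation and definitions below are illustrative and not drawn from the paper): for an observable Â with equilibrium value A_eq and a quantum state ρ(t), one natural way to write the two notions of fluctuation is

```latex
\[
  (\Delta A)^{2}_{\mathrm{Gibbs}}(t) \;=\; \operatorname{Tr}\!\left[\rho(t)\,\bigl(\hat{A} - A_{\mathrm{eq}}\bigr)^{2}\right],
  \qquad
  \delta A_{\mathrm{Boltz}}(t) \;=\; \operatorname{Tr}\!\left[\rho(t)\,\hat{A}\right] - A_{\mathrm{eq}},
\]
```

where the first tracks the spread of projective measurement outcomes about the equilibrium value at a given time, and the second tracks the dynamical wandering of an individual system's expectation value away from equilibrium.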