In the landscape of theories of laws of nature, one of the most popular approaches is the Best System Account, developed by David Lewis [Lewis, 1973, Lewis, 1983, Lewis, 1994]. On this descriptive account, the laws are the regularities that earn the status of axioms or theorems in the best system. For Lewis, the bestness of systems is evaluated by the balance of two virtues: strength and simplicity. Following Lewis, many authors have developed and refined the original account in different ways in a flourishing contemporary literature ([Eddon and Meacham, 2015] and [Hicks et al., 2023] offer good overviews of this diversity).
In this presentation, I want to focus on the virtues of evaluation (or criteria) that have been criticised in the literature, especially the criterion of simplicity, in [Woodward, 2014], but also in [Loewer, 2024] and [Jaag and Loew, 2018]. I claim that some of the issues concerning the criteria of evaluation and their application can be solved by relying on pluralistic accounts, which allow multiple best systems for different sciences (an idea introduced in [Schrenk, 2006] and [Cohen and Callender, 2009]). More precisely, I argue not only that the criteria of evaluation depend on the selected properties of the system (an idea sketched in [Schrenk, 2023]) but, more generally, that the selection of properties and the selection of criteria constrain each other. In short, we need a form of coherence between criteria and properties in best systems accounts. This coherence is an additional constraint on pluralist accounts, but it also allows them to solve a number of other problems concerning the vagueness and the application of the criteria.
Usually, reductionism is perceived as a threat to Scientific Pluralism. In the first part of the paper, I argue that the opposite is true: Explanatory Reductionism accounts for how diverse representations can be complementary.
For this claim to be plausible, Explanatory Reductionism must be distinguished from certain ontological assumptions (Ontological Reductionism) that are usually taken to underwrite it. Such assumptions (e.g., concerning the ontological priority of the micro-level), if true, would indeed undermine Scientific Pluralism. In the second part of the paper, I show why Explanatory Reductionism is not committed to these assumptions.
Oppenheim and Putnam's traditional account of levels conceives the world as hierarchically organised into naturally given structures defined by part–whole relations: higher-level entities (e.g., organs) are composed of lower-level components (e.g., tissues) (Oppenheim & Putnam 1958). Despite its intuitive appeal, this conception has been widely criticised for its conceptual ambiguity (Potochnik 2017). Competing accounts appeal to heterogeneous criteria (including spatial composition, mechanistic organisation, and spatiotemporal scale) without any necessary connection between them.
A prominent response to this ambiguity is the adoption of scale-based approaches, which shift the focus from levels to scales and deliberately avoid reliance on spatial part–whole composition (Ladyman & Ross 2007; McGivern & Rueger 2010; Batterman 2021). However, this raises a question: if scales are not unified by spatial mereological relations, what binds scales into a hierarchy?
To answer this question, I argue that scale hierarchies can be sustained only if they are grounded in a non-spatial relation of dependence. Dependence, I suggest, is a necessary condition for any scale-based account of hierarchical organisation. To defend this claim, I examine two competing strategies for explaining such dependence. On the first approach, dependence is an asymmetrical relation between fundamental physics (covering broader scales) and the special sciences (restricted to limited spatiotemporal domains) (Ladyman & Ross 2007). On the second approach, macro-scale behaviour depends on micro-scale behaviour insofar as the former is a non-spatial part (or subset) of the latter (McGivern & Rueger 2010).
David Lewis's original Best System Account (BSA) of Laws of Nature is anything but pluralistic. However, modern variants thereof, the Better Best System (BBSA) and Pragmatic Best System (PBSA) Analyses, allow for a multitude of different sets of laws for different sciences. In this talk, I will examine in what sense these BSA variants are pluralist, in contrast to Lewis's monism. First, regarding the status of their pluralism: Are they epistemic or ontological pluralisms? How realist about laws are the resulting B/PBSA theories? Second, regarding their structure: Are the best systems they yield hierarchically organised or radically independent? And what is the status of the fundamental physical laws on these theories? Although there are clear and strong pluralist aspects within the B/PBS Analyses, I will tentatively argue that monist residues might remain, not least because the special-science systems appear to presuppose a shared mosaic, and supervenience relations continue to do load-bearing work even in the pluralist variants. The upshot may be the existence of a (unique) ontological fundament, with various supervenience relations holding between the different levels of inquiry/reality.
Drawing on manipulationist theories of causation, I propose a manipulationist theory of chance according to which chance consists in the stability of probabilistic distributions relative to classes of interventions treated as identical with respect to a system’s initial conditions. On this view, chance arises when such a class of interventions yields a stable probability distribution over outcomes without determining any particular outcome with certainty. An event counts as chancy when it is associated with a non-trivial probability of occurring under a given class of interventions on the system’s initial conditions. Chance is thus relative to a class of interventions, insofar as they structure the system into conditions that generate stable outcome distributions without allowing full control over individual outcomes.
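As a rough formal sketch, and only as one possible way of regimenting this proposal (the notation for the class of interventions and the induced distributions is introduced here purely for illustration), the idea can be put as follows:
\[
\mathrm{Ch}_{\mathcal{I}}(E) = p \quad \Longleftrightarrow \quad \forall i \in \mathcal{I}:\; P_i(E) \approx p, \qquad 0 < p < 1,
\]
where \(\mathcal{I}\) is a class of interventions treated as identical with respect to the system's initial conditions and \(P_i\) is the outcome distribution generated by intervention \(i\): the distribution over outcomes is stable across \(\mathcal{I}\), while no intervention in \(\mathcal{I}\) fixes whether \(E\) occurs.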
This theory of chance is agnostic with respect to determinism: stable probabilistic distributions under interventions are compatible with both deterministic and indeterministic underlying dynamics. Although the view could be interpreted in realist terms, I defend a perspectivalist interpretation. On this view, probabilistic distributions should not be understood as intrinsic properties of the world, independent of any perspective. Therefore, chance is not a property of reality, but a perspectival feature of intervention-grounded descriptions of it.
This account thereby moves beyond the traditional dichotomy between epistemic and objective chance. It allows for a non-reductive conception of chance that avoids commitment to determinism or indeterminism while preserving the explanatory role of probabilistic reasoning in scientific practice.
In this talk I address the topic of diversity and unity by building on my previous work on perspectival realism (Massimi 2022) and arguing for the need to level the epistemic playing field in science. I will present research originating from my recent RSE project "Ocean and Us" (Massimi, Brown, and Jaspars, OUP 2026) to illustrate the epistemic role and value of varieties of local knowledges for science.
Prompted by questions about the demonstrative value of empirical studies of scientific practice for different types of ontological claims, this presentation underlines the pluralistic elements in the precise and demanding form of metaphysical naturalisation put forward by Ladyman & Ross (2007) and their subsequent writings. The persistent tension between naturalisation and unification, together with a critical discussion of their arguments for its attempted resolution, leads us to assert the likelihood of a solidarity between their naturalistic / scientistic stance and a pluralist stance. It is then proposed that this could be considered an a fortiori argument for the general convergence between metaphysical naturalism and ontological pluralism.
If a methodological and a priori choice about the primary relevance of current science for metaphysics is directly tied to the stance of ontological pluralism, then we need to reconsider what a study of scientific practice or theory can adequately hope to bring to the metaphysical discussion. Discourse on general “ontological stances”, including ontological pluralism, has to accept and defend its legitimacy as a stance à la van Fraassen in order to remain robust and to avoid falling into circular reasoning that would take its interpretation of scientific practice as proof of what is in fact an already implicit position.
Kerry McKenzie has recently challenged the possibility of effective metaphysics in physics. At the core of her argument stand three claims: (i) successive theories ought to be metaphysically compatible in their overlapping domains; (ii) the metaphysical content of effective theories is typically different from that of their successors and of the approximate theories derivable from them; (iii) metaphysical propositions are not amenable to approximation, in contrast to their mathematizable physical counterparts. My goal in this talk is to address McKenzie's argument and outline what I take to be the most promising route for the project of effective metaphysics (and thus for a pluralistic metaphysics in physics). In short, while (ii) and (iii) seem to hold overall, (i) is, in my view, too demanding. The situation is indeed far worse than McKenzie envisages: typically, the derivation of approximate theories from a putatively fundamental theory already involves metaphysical change. Naturalized metaphysicians thus seem to be confronted with a dilemma: either (a) forgo their naturalistic aspirations by denying metaphysical content to respectable non-fundamental theories, or (b) inflate their metaphysical commitments by taking non-fundamental theories and their coarse-grained physical content seriously. I mount a preliminary defense of (b).
A common tendency in the metaphysics of science is to treat non-causal dependencies such as realization and part-whole relationships as metaphysically and formally akin to causation. Against this tendency, I argue that the metaphysics of science should be pluralist about dependence relations, recognizing non-causal dependencies as structurally different from causal ones and as requiring distinct formal representations. Drawing on recent joint work with Andreas Hüttemann, I motivate this view from within the interventionist account of causation and the causal modelling framework associated with it, showing that key features of non-causal dependencies resist assimilation to causation and are better treated as mutual dependence relationships within causal models. Beyond proposing a pluralistic picture of dependence, I highlight three further contributions of this framework to a pluralist agenda in the metaphysics of science. First, it provides an attractive solution to the causal exclusion problem, perhaps the most formidable threat to ontological pluralism. Second, it supports a particularly substantive form of pluralistic physicalism that vindicates the full causal efficacy of high-level properties and allows for a robust form of downward causation. Third, it entails that in certain multilevel settings the causal facts of a situation can be legitimately described in multiple ways, yielding a form of pluralism about causation itself, though one that is interestingly different from more standard varieties of causal pluralism.