Juliette Garcynsky (Sorbonne Université & IHPST)
The evolution of the concept of age and theories of aging: epistemological issues
There is no consensus on aging in biology, particularly regarding what the term itself designates, that is, its specific object. Consequently, the concept of "aging" remains unclear. At the same time, evolutionary theories of aging are plural, and the question of their potential unification is complex. From this starting point, we seek to examine the legitimacy of the evolution of a concept closely linked to aging: the concept of "biological age". We believe that the evolution of this concept would be highly useful, based on the hypothesis that theories of aging, notably those relating to social evolution, provide sound reasons to re-examine the "biological" nature of the concept of age. We posit that there would be a considerable epistemic gain in replacing it, insofar as the development of an alternative concept of age, which we shall call "biosocial", would allow for a different foundation for the measurements linked to it and, potentially, for the theories of aging themselves. The primary interest would be to better understand the theoretical gap, both epistemological and ontological, existing between the two, and more precisely, the advancement offered by the substituting concept. If it is only through the explanation of the first concept (biological age) that we can grasp the specificity of the second, we will also gain a new, and hopefully better, understanding of the first concept through the lens of the second, notably by explaining its limits and singularities.
Juan Carlos Sánchez Hernández (Universidad Autónoma Metropolitana - Iztapalapa)
Morality as an adaptation guides evolutionary epistemology of modality
Few philosophers have considered evolutionary epistemology in relation to our capacity to make modal judgements. The remarkable exceptions are: Nozick's evolutionary skepticism about necessary truths, since Nature favors thinking that is simple and effective even when it is not strictly accurate; Williamson's account of modal thinking as a byproduct of counterfactual reasoning, which we might regard as an adaptation for learning from our mistakes and for acting differently in similar states of affairs; and, finally, Vetter's evolutionary defense of dispositionalism, since the ability to recognize dispositional properties allowed us to modify elements in our environment. In this paper I suggest that, since modal verbs serve both modal statements and deontic statements, evolutionary studies of morality might help us to understand how modal reasoning might also have been subject to evolutionary pressures.
Nicola Bertoldi (ERC AdG “Oecologie – The German sources of ecology”, CEREG, Université Sorbonne Nouvelle)
The strange case of Doctor Charles and Herr Darwin. How did German “Darwinismus” differ from Darwin’s thought?
As highlighted by studies such as Pietro Corsi’s, the reception of Darwin’s On the Origin of Species in continental Europe was doubly skewed: on the one hand, the core theses of Darwin’s theory of evolution by natural selection were readily accepted and popularized by an intellectual milieu external to mainstream scholarly institutions; on the other hand, such enthusiastic purveyors of Darwinian ideas tended to frame On the Origin’s theses through their own preconceived understandings of evolution. The case of Darwin’s reception in the German-speaking world is particularly interesting from this standpoint. Sander Gliboff’s study on the diffusion of On the Origin’s German edition notably argues that figures such as Heinrich Georg Bronn and Ernst Haeckel contributed to filtering Darwin’s thought through the legacy of German Naturphilosophie, e.g., Goethe’s morphology and theory of metamorphosis.
This contribution aims to shed further light on the history of Darwin’s reception through the following question: How did “Darwinismus,” stemming from German translations/interpretations of On the Origin of Species, differ from Darwin’s thought?
To this aim, it deploys two methodological approaches. First, it provides a distant reading of all German translations of Darwin’s works accessible on the website https://darwin-online.org.uk/. Through topic modelling and co-occurrence analysis, this approach pinpoints lexical-conceptual patterns distinctive of how Darwinian concepts and ideas were assimilated within the German context, e.g., how the rendition of “origin” as “Entstehung” affected the semantic field associated with evolutionary terminology. Second, this contribution adopts a history-of-concepts approach to elucidate the structure and function of concepts such as “Entstehung” and “Entwickelung” within the broader intellectual and social context of Darwin’s reception in Germany. This conceptual-historical analysis relies on a close reading of relevant documents identified through the first approach.
Bernardo Yáñez (Dirección de Antropología Física-INAH)
Darwinism, Modern Synthesis, and Extended Evolutionary Synthesis: an account of the process of evolution of evolutionary theory
Within the framework of contemporary studies on the history and philosophy of the life sciences, evolutionary theory constitutes a paradigmatic case for analyzing processes of long-range conceptual transformation. Modern biological evolutionary theory has its roots in the transformist thought of the nineteenth century, which redefined the foundations of Natural History and prepared the ground for the formulation of Darwinism. With the publication of On the Origin of Species (1859), natural selection was established as the central explanatory principle of organic change, inaugurating a new theoretical matrix for understanding biological diversity. This work examines, from a historical-epistemological perspective, three fundamental moments in the configuration of evolutionary thought: classical Darwinism, the Modern Synthesis, and the Extended Evolutionary Synthesis. Through the analysis of their ontological assumptions, explanatory models, and methodological strategies, continuities, tensions, and conceptual displacements between these phases are identified. The study proposes that the history of evolutionary theory cannot be understood as a mere accumulation of empirical knowledge, but rather as a dynamic process of theoretical rearticulation that reflects broader changes in the understanding of causality, inheritance, and the organization of the living.
Max Fernández de Castro (Universidad Autónoma Metropolitana - Iztapalapa)
Evidence in favor of Church’s thesis
In this talk, I will briefly review the types of arguments presented in favor of Church’s thesis: those of Church and Kleene, the convergence of attempts to define computability, Turing’s analysis, Kripke’s argument, and the axiomatic proof. I will explore whether it is possible to prove the adequacy of a concept to an intuitive notion.
Eduardo Ugalde Reyes (Posgrado en Filosofía de la Ciencia, UNAM)
A Brief Excursion into the Notion of Proof Through the Curry-Howard Correspondence
Much contemporary philosophy of mathematics seeks to explicate the notion of proof. Yet many proposals proceed either through metaphysical theorizing about proofs as abstract objects or through naturalized epistemology that shifts attention to the psychology and sociology of reasoning, often leaving mathematical practice itself in the background. This talk returns to proof as it appears in constructive mathematics by using the Curry–Howard Correspondence as a guiding framework. We use the correspondence to make precise the dual aspect of proofs: epistemologically, as acts that confer grounds to judgments; ontologically, as structured mathematical objects (proof-objects) governed by formation and computational rules. On this basis we show how central proof-theoretic phenomena—especially normalization—clarify what it is for a proof to justify a conclusion. The overall aim is not a new metaphysics of proofs, but an account of proof that stays close to the discipline’s own standards of correctness and construction.
Edgar Enrique Solís de los Reyes (UNAM)
Some examples of changes in mathematical knowledge
There are some mathematical results that represent points of change in mathematical knowledge; they are cornerstones. These results are answers to specific problems. In this lecture, I am going to show the context of some of these problems and the process of their solutions, and I am going to explain how these results meant changes, as well as some of their consequences for the development of mathematical knowledge.
Anabel Jáuregui Hernández (SECIHTI-Facultad de Ciencias, UNAM)
From cyclic quadrilaterals to the symmedian point: antiparallel lines in geometry
In this talk, the origin and development of the concept of “antiparallel lines” will be reviewed. In modern geometry books this concept is usually employed; however, the history of its origin and development has rarely been treated in the existing literature. The definition of antiparallel lines is equivalent to that of a cyclic quadrilateral; for this reason, our study of this concept will begin with a review of the role of the cyclic quadrilateral in Euclid’s Elements. Subsequently, we will see the explicit formulation of the concept and the study of the properties of antiparallel lines in the Nouveaux Éléments de Géométrie (1667) of A. Arnauld. Finally, we will review a very interesting episode in the history of geometry known as the New Geometry of the Triangle; we will reflect on how antiparallel lines were incorporated into geometry in relation to new geometric objects, such as the symmedian lines and the symmedian point.
Sébastien Maronne (Institut de Mathématiques de Toulouse)
Birth of algebraic geometry? Descartes and Fermat
I will inquire into the birth of algebraic geometry, and of the concepts and methods that characterize it, by studying the Isagoge ad Locos Planos et Solidos of Fermat and Descartes’ Geometry, and the lines of force that run through this theory at the crossroads of algebra, geometry, and number theory.
David Rabouin (SPHERE, CNRS & ERC Philiumm)
Between Euclid and Al-Khwarizmi: How to Write the History of the “Irrational” in Mathematics?
The concept of what we now call “irrational” appears in mathematics in Euclid's work in the form of a contrast between what is expressible (rhete) and what is alogos (an ambiguous word in Greek, since it can mean either “without ratio,” “without reason,” or “which cannot be stated”). It is this latter meaning that will be favored by Arab authors, who refer to “surd” quantities, assam (literally “deaf”, but once again the word can mean “without reason”). With the development of algebra, they will gradually expand this category of geometric magnitudes to include numbers that the Latins will call numeri surdi. We can thus see the concept of irrationality changing its scope of application to gradually take on the modern meaning of “irrational number.” But can we really say that it is the same concept from one context to another? By revisiting some historiographical debates, I would like to show that the matter is more complex than it appears at first glance.
Simon Gentil (Sphère (Paris) - IMT (Toulouse))
Geometric loci and curves
My presentation will have as its objective to comment on the manner in which the notion of geometric locus nourished the development of the mathematical concept of curve in the seventeenth century. In particular, I will return to the notion of geometric locus among the Ancients before analyzing the modifications introduced by the Moderns to justify their conception of curves.
Carmen Martínez-Adame (Facultad de Ciencias, UNAM)
The evolution of the concept of a function in Euler's early works
This talk examines the early (and less well-known) development of the concept of a function in the works of Euler, focusing on the transition from Johann Bernoulli’s 1718 definition to Euler’s own formulation in the Introductio of 1748. Through a close reading of key texts, it traces how the notion of a function evolved from a geometrically grounded description of quantities associated with curves to a more general analytic framework.
Jean Dhombres
To question the denominations given to what mathematicians deal with: objects, percepts, concepts?
My questioning, based on three words—and I could have added "things" in the manner of Aristotle—may not concern a mathematician engaged in the act of creation, which nonetheless involves a collective element and therefore questions of vocabulary, as we have known since Euclid's Elements. My introspection touches on the narrative or account, even the judgment, of historians and philosophers, but also that of authors of mathematical treatises (and not only Bourbaki). I want to frame questions on a case that did not appear before the scientific revolution, that of the notion of function, invented by Leibniz in 1673, although already underlying that of "law of nature", and a source of ambiguity in the 18th century despite a major practice with the various functional equations. It received a delimitation for the variable, therefore perhaps becoming an object (interval of definition), only with Fourier at the beginning of the 19th century; it was enriched with the qualifications of regularity (continuity, differentiability, Lipschitz character, then computability today) which transform such functions into so many “percepts” in the manner of Deleuze. The notion was fixed as a concept by Cantor, and seemed to lose its connection with the notion of a variable, due to Lebesgue's "almost everywhere" conception, and above all the Schwartz and Sobolev distributions, seen as “generalizations” of functions. It has become common vocabulary (a “thing”) both in secondary education and in the economic world, although losing the quantitative notion of a law in favor of that of a shape (for example an S-curve or Σ), and perhaps supplanted by the notion of an algorithm whose complexity is measurable. Essentially, starting from a history of functions that has already been largely written and has proven very changeable, the aim is to probe the formal and syntactic processes of a mathematization of causality.
Favio E. Miranda-Perea (Facultad de Ciencias, UNAM)
Some reflections about the notion of truth based on Proof Theory
Truth has been one of the central subjects in Philosophy since ancient times. In logic the notion of truth is usually identified with Tarski's theory, where sentences are considered as bearers of truth; that is, a sentence A is true if and only if A denotes a fact with respect to the "actual world". Although strongly established, for instance as the root of Model Theory, this position has been challenged, for instance by Dummett, who claims that such a mathematical notion cannot simultaneously serve to determine the concept of truth and the meaning of sentences, and that instead one has to take one of these two notions as given. The purpose of this talk is to present and reflect on an alternative concept of truth that emerged from Proof Theory, a branch of mathematical logic traditionally considered as part of syntax and thus not related to semantics. This evolved concept of truth, where proofs are the semantic values of sentences, constitutes the basis of so-called proof-theoretic semantics; it helps to resolve Dummett's criticism and provides a tool for addressing the question of whether a computer-assisted proof guarantees the truth of a theorem.
Isabel Hernández Paredes (Posgrado en Filosofía de la Ciencia, UNAM)
Fray Diego as a Reader of Pappus in Seventeenth-Century New Spain
In this talk, I explore the reception and reworking of Pappus of Alexandria’s Mathematical Collection in seventeenth-century New Spain through a specific case: Proposition 164 of Book VII. Drawing on manuscript evidence, I suggest that, despite the absence of this text from known inventories, its circulation in the New Spanish context can be traced and was likely mediated by Federico Commandino’s Latin edition (1588), whose translation and commentary served as a key vehicle for the transmission of Pappus’s work in the early modern period.
Against this background, I focus on how this problem reached the hands of fray Diego Rodríguez and how he engaged with it mathematically. I argue that, rather than being a passive recipient, Diego subjected the problem to a rigorous and original treatment, positioning himself within the broader geometrical discussions of his time as the first professor of mathematics at the Royal and Pontifical University of Mexico.
Guillermo Zambrana (Universidad Autónoma Metropolitana - Iztapalapa)
The transition from concept to object. The infinite in mathematics
Starting from a review of the calculation of the length of a spiral, proposed by John Wallis (Arithmetica Infinitorum), the decisive impetus toward the ‘objective’ nature of infinitely small magnitudes as ‘objects’ subject to well-defined (arithmetic) rules of operation will be discussed.
Two key aspects will then be analyzed: the solution of geometric problems (Wallis, Pascal) by accepting and assuming these magnitudes as legitimate, and the demonstrative and foundational value shown by the mere application of the rules to the objects mentioned here.
Aurore Franco-Ricord (Université Paris 1 Panthéon-Sorbonne)
Metzger and scientific knowledge – How does transcendental imagination illuminate the formation and evolution of scientific concepts?
This communication proposes to reexamine the contribution of Hélène Metzger to historical epistemology by situating it between two poles: the analysis of the sciences as rational structures and their understanding as cultural phenomena. Rather than focusing solely on the context of justification, Metzger invites us to think about the conditions under which scientific concepts are elaborated.
Taking up the Kantian framework, she assigns a central role to transcendental imagination. The synthesis between experience and intellect would not be primarily conceptual, but rather analogical and metaphorical. Scientific concepts, like other general concepts, are formed through a work of analogy that is rooted in a common mythopoietic background, upon which logico-mathematical formalisms come to be structured. This transcendental is, however, historicized: it refers to collective, cultural, and anthropological forms of imagination.
Science thus appears as a historicized rhetoric, and the philosophy of science as an anthropology of symbolic practices. This perspective will be illustrated through the comparative analysis of contemporary scientific concepts from heterogeneous fields. It will be shown that the Metzgerian approach offers an innovative and relevant methodology for approaching the history of science as a cultural history of forms of thought, capable of articulating analytical rigor and attention to the historical conditions of rationality.
Christelle Montjean (IHPST - Paris 1)
The persistence of the concept of genus in biological classifications
The concept of genus in biological taxonomy constitutes an example for analyzing the transformation of scientific concepts through theoretical changes. Introduced within the Linnaean framework (eighteenth century) as a central element of the classification of living beings, genus has never been the object of a strict definition, and its ontological status has remained problematic from the outset. Subsequent developments in evolutionary biology, and then in phylogenetic systematics, have called into question the legitimacy of genus as a natural category, by showing its dependence on choices of criteria and on classificatory traditions.
Nevertheless, genus has not disappeared with these theoretical transformations. It has been reconfigured and maintained because of its role in binomial nomenclature and its relational function, allowing species to be situated within a network of evolutionary proximities, even if partially reconstructed or artificial. The study of genus thus brings to light the distinction between the ontological value of a concept and its epistemic and practical value, and calls into question the conditions for the persistence of scientific concepts despite the undermining of their theoretical foundations.
Ana Barahona Echeverría (Facultad de Ciencias, UNAM) & Marco Ornelas-Cruces (Facultad de Ciencias, UNAM)
From the field to the laboratory: when genetic engineering transformed biological nitrogen fixation in Mexico during the 1980s
During the 20th century, biological nitrogen fixation (BNF) was understood as a phenomenon strictly associated with free-living bacteria or symbionts of leguminous plants. This approach integrated concepts from microbiology and plant physiology, and focused on the symbiotic process as the object of study. Therefore, scientific explanations were organized around the identification of microorganisms, the analysis of their interactions with plants, and the characterization of the BNF process. Towards the end of the 1970s and early 1980s, the development of bacterial molecular genetics and genetic engineering allowed these problems to be rethought from new experimental practices. BNF began to be explained at the level of specific genes and metabolic pathways, shifting the object of study from ecological interaction to genetic engineering and sequencing. This change did not imply the abandonment of symbiosis, but rather its insertion into a different set of practices, instruments, and research questions. In Mexico, this process enabled the development of genetic engineering in a context marked by the search for self-sufficiency and food production, as well as the reduction of chemical fertilizer use. Thus, BNF acquired strategic relevance in a local context with studies on nitrogen-fixing bacteria and their genes. The possibility of genetically transferring nitrogen fixation processes to other organisms using genetic engineering techniques redefined the objects of study. What was traditionally researched in large controlled crops began to be studied—and manipulated—in in vitro systems using molecular cloning techniques. This paper allows us to reflect on how scientific practices and concepts can be transformed and acquire new meanings when integrated into reconfigured experimental systems. Conceptual evolution thus appears to be linked to the co-production of concrete research practices, rather than to abstract definitions or abrupt theoretical breaks.
Caroline Angleraux (Inserm U 1253 - iBrain)
The Evolution of Cell Theory: Renewed Heuristic Power?
Cell theory is one of the central structuring paradigms of biology, alongside the theory of evolution by natural selection, the theory of heredity, and the material theory of life (Gayon 2014). Its status as a paradigm stems from its ambition to provide an explanation of living systems that is both exhaustive and reductive: all living beings have cellular forms, and cells are the irreducible organic units.
Yet, although there are no living processes without cells, the study of the cell as such often appears to recede into the background. It is frequently overshadowed either “from below,” by a focus on molecular processes, or “from above,” by studies centered on the organism or the ecosystem. In the former case, the cell is approached primarily as a site of molecular activity (Bechtel 2010); in the latter, it is treated as an underlying component lacking autonomy within a broader system. When the cell is taken as an explicit object of study (as in systems biology), it tends to appear as an intermediate level that helps connect molecular dynamics to the behavior of organisms or ecosystems.
Some disciplinary fields, such as developmental biology, do study the cell in its own right, treating it as an autonomous object of investigation. However, in this case, the emphasis is placed primarily on the specific cellular models employed; the unity of the concept itself remains in the background, while the diversity of cellular forms comes to the fore.
The cell concept is thus both central and yet marginal within the theoretical network. It reveals specific tensions that have also served as sources of conceptual strength: theoretical polysemy, involvement in an analytical reductionist approach, and persistent vagueness in the identification of cell types. This talk aims to map these tensions and to propose philosophical avenues for bringing the concept of the cell back to the foreground.
Brian Becerra-Bressant (Facultad de Ciencias, UNAM)
Evolutionism in Mexico: the first contents in the National Preparatory School and other educational institutions (1867-1897)
In Mexico, it is unclear how evolutionary content was chosen and introduced into the first textbooks used at the National Preparatory School. The school was founded in 1867 based on a positivist scientific model. Its first director, Gabino Barreda (1818-1881), considered incorporating the teaching of natural history into the curriculum and, with it, notions of evolutionism. However, classical literature suggests that evolutionism first appeared in schools in 1877, in the book Compendio de la Historia de la Antigüedad (Compendium of Ancient History), written by Justo Sierra (1848-1912) (Moreno, 1984). After reviewing the aforementioned book, it was found that there are only two paragraphs that discuss Darwinism and its basic concepts. On the other hand, recent literature suggests that evolution officially appeared in Mexico in the early 20th century, through the books and courses of Alfonso Luis Herrera (1860-1942) (Cueva, 2024).
In light of the above, this paper analyzes how and when the first evolutionary content appeared in Mexican schools, demonstrating that Sierra's Darwinism was not necessarily the only or the first to appear on this subject, and taking into account that the documents from 1867 to 1877 have not yet been reviewed in detail. On the other hand, it analyzes the possible evolutionary content that appeared between 1877 and 1887, with the intention of showing the richness of evolutionary notions prior to the turn of the century and the official presence of evolution in the country.
This paper is part of the author's PhD research.
Juan Manuel Rodríguez Caso (Universidad Autónoma Metropolitana - Xochimilco) & Erica Torrens (Facultad de Ciencias, UNAM)
Scope of the deconstruction of a scientific theory: a historical review of Darwinism (and its synonymies)
This paper explores the conceptual and methodological limits of deconstruction applied to scientific theories, taking as a case study “Darwinism” and its multiple denominations: natural selection, biological evolution, neo-Darwinism and the modern synthesis, extended synthesis, among others. It examines the way in which, in historical terms, speaking of an apparently unified theory responds more to a narrative than to a properly theoretical development.
The work critically reviews the terminological polysemy that surrounds Darwinism, identifying how different scientific communities have re-signified its central postulates. The philosophical implications of “deconstructing” a current scientific theory are discussed: does it imply refutation, transformation, or interpretative multiplicity?
Through a historical survey of key controversies—the transformist proposal of Jean-Baptiste Lamarck, the Darwin–Wallace debate, up to the criticisms of saltationism and developmental biology—it is evaluated whether deconstruction can coexist with empirical validation or whether it necessarily represents a critical operation external to scientific practice. The aim is thus to reflect on the pertinence of applying postmodern hermeneutic tools to the analysis of theories in the natural sciences.
Solange Haas (Université de Pau)
Quantophrenia in the production of predictions on biodiversity and public decision-making
As an expression of biological diversity in crisis, “biodiversity” must respond to the political challenge of its conservation (Casetta & Delord, 2014). Since the mid-1980s, part of the efforts in ecology and conservation biology has been concentrated on the quantification of biodiversity at its different levels of integration and spatio-temporal scales (Devictor, 2015). We will endeavor to show that the reduction of biodiversity to its quantified representations can be described as ‘quantophrenic’ (Desrosières, 1993; Porter, 1995) and also affects the production of predictions.
Relying on a distinction between uncertainties (among which quantified imprecision) and ignorances (which emerge when scientific uncertainties are confronted with a norm of action), we will show how this quantophrenia has as a consequence the emergence of a form of scientific ignorance in which biodiversity appears as an exclusively technical problem, thus orienting public decisions.
Paola Hernández Chávez (Universidad Autónoma Metropolitana - Azcapotzalco) & Jonatan García Campos (Universidad Autónoma Metropolitana - Azcapotzalco)
The automatization of ‘seeing’, ‘looking’, and ‘creating’ through the use of AI: Critical observations from Cognitive Sciences
After questioning to what extent artificial intelligence is “intelligent” and to what extent it is “artificial”, as well as what we can expect from it, we analyze what happens to a significant number of human cognitive faculties—starting with vision, memory, and learning processes, among others—in the face of the massive use of Artificial Intelligence, exposure to electronic devices, and digitalization.
As we know, in the last decade—and especially after the COVID-19 pandemic—a significant portion of people started using electronic devices for communication, work, education, and recreation. This trend increased with the mass adoption of generative Artificial Intelligence in recent years, as a series of automated AI generators became widely available.
In this work, we use the massive adoption of artificial intelligence generators and the overexposure to screens and electronic devices to analyze their effects on our human cognitive capabilities. This analysis adopts an evolutionary perspective. We question which skills—once fundamental for our survival—are at risk due to the extensive use of AI and electronic devices.
As a case study, we examine the human ocular morphology in relation to social cooperation to show how exposure to digital environments is altering the physiological and social structure of the human eye, manifested in the loss of visual acuity. In addition, we mention the case of a series of memory processes we are inhibiting due to the vicarious use of electronic devices and their impact on cognitive processes.
We conclude with some critical comments regarding our technofeudal era.
Théophile Richard (Université Paris Cité)
The incommensurability of concepts, between the history of science and developmental psychology
In The Structure of Scientific Revolutions and in his subsequent texts, Thomas Kuhn grants a central place to the notion of incommensurability. This notion is applied to a great many cases which at first glance seem different. The objective of this presentation is to put this notion of incommensurability to the test by comparing (a) the context in which it was first conceived by Kuhn, (b) the more specific case of mathematized theories, and (c) the use made of it in contemporary developmental psychology. The aim of this comparison is to underscore the necessity of distinguishing the types of relations that different conceptual systems can maintain with one another, in order ultimately to nuance the scope of Kuhn’s voluntarism.
Vincent Jullien (Université de Nantes & IHPST)
Does the chôra of the Timaeus make sense in contemporary cosmology?
Note on a two-millennia-long controversy, ‘The matter of the heavens’ or ‘the universal conceptual migration of quintessence.’
The theories of the matter of the heavens will be addressed, without losing sight of a general epistemological thesis, namely, that cosmology, like all other sciences, associates ‘progress of knowledge,’ ‘provisional nature of theories,’ and ‘increase of ignorance.’
An attempt will be made to glimpse the scientific framework that associates the chôra of the Timaeus with the dark energy of contemporary cosmology, passing through Descartes’ subtle matter and Newton’s sensorium Dei. This is, of course, a project a hundred times too vast to be reasonably examined here. Therefore, I will only propose some notes related to it.
Gautier Depambour (Université Paris Cité)
From quantum electrodynamics to quantum optics: example of a transfer of knowledge in light of the modular structure of physical theories
In 2008, the historian of physics Olivier Darrigol proposed a new vision of the formation and evolution of physical theories based on the concept of ‘module.’ A module is a kind of fundamental building block of a physical theory, and is itself defined as a physical theory. It contains at once a symbolic universe, that is, a set of symbols that characterize the states of a physical system; fundamental laws, which condition the behavior of physical systems in the symbolic universe; and interpretative schemes, which give physical meaning to the formalism by associating it with possible experiments to be conceived.
From this perspective, physics appears as a vast network of modules connected to one another, and capable of evolving over time. This approach makes it possible in particular to account for transfers of knowledge that may occur between different fields of physics, and which can be interpreted in this context as a sharing of modules. This is what I will illustrate using an example drawn from my own research, namely that of quantum optics. I will show, indeed, that the birth of this field of research results from a transfer of knowledge from quantum electrodynamics (QED) to the domain of optics, and that QED can, for this reason, be considered as a module of quantum optics.