Science develops cumulatively. Discoveries are made, theories are developed, experiments are designed by standing on the shoulders of giants. We are all, in this sense, historical beings; we are children of our time. This is true for revolutions as well. They, too, “need time for their accomplishment; [they], too, have a history” (Koyré 1957, p. viii).
What we call the scientific revolution of the 16th and 17th centuries* is, in hindsight, a formative process of change in the way we think. Not a process of thinking new thoughts, but one of thinking differently, looking differently, and therefore seeing differently; a period during which “human, or at least European, minds underwent a deep revolution which changed the very framework and patterns of our thinking and of which modern science and modern philosophy are, at the same time, the root and the fruit” (Koyré 1957, p. vii, my emphasis).
The revolution can be characterized in different ways, depending on one's locus of interest. (1) Moving away from a geocentric (geo = earth, kentron = center) towards a heliocentric (helios = sun) astronomy. (2) Moving away from contemplative knowledge towards active and operative knowledge**: in Alexandre Koyré’s terms, from man as spectator to man as master of nature. Descartes writes in his Discourse on Method, Part Six: “instead of the speculative philosophy that is taught in the schools, one may find a practical philosophy […] and thus render ourselves masters and possessors of nature” (Descartes 1637/1967). (3) Moving away from teleological and organismic ways of thinking towards mechanical and causal patterns of explanation. (4) Moving towards a new philosophy “in which the skies no longer announced the glory of God” (Koyré 1957, p. viii) and a corresponding secularisation of consciousness.
All this was required and made possible by the mathematization of nature: eliminating qualities in favor of quantities, grounding knowledge in measurement and experiment. In other words, by striking subjective or value-laden concepts from our explanation of nature, by a “[d]ivorce of the world of value and the world of facts” (Koyré 1957, p. 2). For Galileo, for example, only measurable properties (primary qualities) belonged to the object itself; the subjective qualities were in us, not in nature: “I think that tastes, odors, colors, and so on are no more than mere names so far as the object in which we place them is concerned, and that they reside only in the consciousness” (Galileo 1623, p. 23). This distinction between primary qualities (what belongs to the object itself) and secondary qualities (what belongs only to our mode of sensing it) was not entirely new. Philosophers before and after him touched upon it***, in one way or another. But with Galileo, the distinction became methodological and a condition for the new science.
The success of this new science rests on the division between what is in us and what is out there (between the subjective and the objective). This division was present not only in the physical sciences but also in psychology and cognitive science, where the subjective was studied by cleverly bypassing the subjective. The behaviourism of Watson and Skinner, for example, treated the mind as a black box: inaccessible and scientifically unmeasurable. The subjective, the mind, had to be bracketed in favor of the objective: observable correlations between stimulus and response. The grandfather of cognitive science, mid-century cybernetics (first-order cybernetics), modelled organisms and machines alike in terms of information, control, and feedback: “Now [with cybernetics] is offered, for the first time, a mechanistic model which, it is claimed, applies to material and mental phenomena at once” (Jonas 2001, p. 110).
The old idea that the book of nature is written in the language of mathematics, which can be traced back to Pythagoras, was an extension of this methodological commitment to the objective. If physical reality could be expressed precisely in mathematics, then the same was expected to be true of the mind. If we could model the mind mathematically, we could create one; if something can be stated with enough precision, it can be operated on and thus controlled (scientia activa et operativa). Cybernetics, information theory, and early computer science developed in parallel with this commitment to mathematical formalization. Shannon showed that communication could be expressed in bits, and von Neumann showed that machines could store and manipulate symbols. Turing's theory of computation showed that any effective procedure could be described through operations on discrete symbols. McCulloch and Pitts combined neurophysiology with engineering and mathematical logic, proposing in their 1943 paper artificial neural networks, that is, systems in which logical operations could be realized by neuron-like units. Motivated by their work, Marvin Minsky and Dean Edmonds built the SNARC (Stochastic Neural Analog Reinforcement Calculator) in 1951, one of the first neural-network machines. With the perceptron, developed in the late 1950s, Rosenblatt attempted to build a machine that could learn from experience.
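These two ideas can be sketched in a few lines of Python (a minimal modern illustration, not the original 1943 or 1950s formalisms; the function names are mine): a McCulloch–Pitts unit fires when the weighted sum of its binary inputs reaches a threshold, which suffices to realize logical operations, while Rosenblatt's perceptron goes further by adjusting its weights from examples.

```python
def mp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts unit: outputs 1 iff the weighted sum
    of its binary inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logical operations realized by neuron-like units:
def AND(x1, x2):  # fires only when both inputs fire
    return mp_neuron([x1, x2], [1, 1], threshold=2)

def OR(x1, x2):   # fires when at least one input fires
    return mp_neuron([x1, x2], [1, 1], threshold=1)

def NOT(x):       # inhibitory weight: fires only when the input is silent
    return mp_neuron([x], [-1], threshold=0)

# The perceptron adds learning from experience: weights are
# adjusted from labelled examples instead of being fixed by hand.
def train_perceptron(data, epochs=10, lr=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b >= 0 else 0
            err = target - pred          # 0 when correct
            w[0] += lr * err * x1        # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b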
The Dartmouth meeting in 1956 gave a name to the ongoing project: AI. Today, AI models**** operate on the principle that underlying patterns can be captured, abstracted, and predicted from data. This is why AI fits so naturally into the new science: science had already been treating the world as structure, and AI intensifies this capacity. In biology, AlphaFold is the clearest example. Developed at DeepMind under Demis Hassabis and John Jumper, it predicts the three-dimensional structure of proteins from their amino acid sequences with a level of accuracy previously thought to be out of reach. In 2024, Hassabis and Jumper received the Nobel Prize in Chemistry***** for this work. In physics, machine learning helps navigate the enormous datasets produced at particle colliders, finding regularities without computing every microscopic detail.
AI and science evolve together. In a sense, AI is the ultimate expression of the ambition of the new science to formalise the world. It provides a technical framework for the methodological commitments that have formed and directed our scientific tradition: the division between what is in us and what is out there (the bifurcation of nature à la Whitehead), the reduction of phenomena to their smallest physical components (reductionism), the ideal of a detached standpoint (objectivity), the identification of the real with the mathematically tractable (surreptitious substitution à la Whitehead) (Thompson et al. 2024).
Science develops incrementally. Concepts crystallise, fade, and reappear in altered forms. Ideas are forgotten, rediscovered, and reframed as our tools and frameworks change. Many of the questions we ask today about matter, motion, and perception were already asked by the ancient Greeks. It can feel as if we are circling the same problems, asking the same questions and arriving at the same, though significantly more informed and nuanced, answers. Certain questions and answers seem to reappear because of the human impulse, or drive, to answer what appear to be the simplest questions: Who am I? Where am I? Why am I here?****** Within the tradition, these questions lost their existential character, because that dimension does not fit the methodological commitments of the new science, which is not a tool to be applied to matters of value. This is not a criticism; we have seen above why it had to be so.
What is interesting is that the simple questions that guided all human curiosity, from which our science, philosophy, religion and art have been born, cannot be treated in full by our advanced scientific establishment. They can only be handled mechanically. AI continues the long historical movement in which human thought becomes formalized, mathematized, and operationalized. It marks the triumph of this process. But it raises again, in a new form, the basic questions: Who am I? Where am I? Why am I here? It remains to be seen whether we will return to these questions or move further away from them. If we opt for the latter, we will instead ask why a certain unease persists, and why anxiety and a sense of existential tension survive even after explanations have been handed over to patterns and computations.
* It is usually dated as beginning with the Copernican revolution (De Revolutionibus Orbium Coelestium, 1543) and culminating in Newton’s Principia (Philosophiae Naturalis Principia Mathematica, 1687).
** From scientia contemplativa to scientia activa et operativa
*** Democritus: “by convention sweet, by convention bitter, but in reality atoms and the void.” (DK 68B9). This is to say that secondary qualities are not in the world at all, only in the perceiver. Aristotle: “Sensible qualities are not actualized except in relation to perception; for each of them is a power (dynamis) of affecting the sense, and this power is not actual unless it is perceived.” (De Anima II.5, 417b2–6). Descartes: “We must not think that in the objects themselves there is anything like the sensations we experience of them… For what we perceive as colors, tastes, and so on, are only various sensations in us.” (Principles of Philosophy, IV, §190). Locke: “Ideas of secondary qualities are nothing in the objects themselves but powers to produce various sensations in us.” (Essay II.VIII.10–14)
**** More specifically deep learning models: a subset of AI models built from many-layered neural networks that learn patterns from large datasets.
***** The prize was shared with David Baker, who was recognized for computational protein design.
****** These are questions about the subject and the object, which, in my opinion, cannot be treated separately. “It becomes apparent that if certain facts about our common experience of perception, or what we might call the inside world, can be revealed by an extended study of what we call, in contrast, the outside world, then an equally extended study of this inside world will reveal, in turn, the facts first met with in the world outside: for what we approach, in either case, from one side or the other, is the common boundary between them.” (Spencer-Brown, 1969, p. xxv)