Continuity

With the rise of CI/CD, continuity has become a buzzword in software. How often one should integrate is a common interview question for DevOps engineers. Yet there are few attempts at defining continuity formally, and without a definition, practitioners rely on their intuition. Not that they'd be in bad company: Descartes saw the real numbers as points on a naturally continuous line, and so did Euler, and even Gauss.

It took two German mathematicians, Dedekind and Weierstraß, in the early 1870s, to understand that continuous is not the opposite of discrete, and to offer the formula we learned at school to express the continuity of a function f at a point x0:

∀ ε > 0, ∃ α > 0 such that d1(x, x0) < α ⇒ d2(f(x), f(x0)) < ε

where d1 and d2 are distances on the spaces in which x and f(x) respectively live.
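A minimal worked instance may help (the choices here are assumptions for illustration only: d1 and d2 are the usual distance on the real line, and f(x) = 2x). Picking α = ε/2 satisfies the definition at every x0:

```latex
% Worked instance, assuming d1 = d2 = the usual distance on the reals and f(x) = 2x.
% Given epsilon > 0, the choice alpha = epsilon/2 works at every x0:
\[
  |x - x_0| < \alpha = \varepsilon/2
  \;\Longrightarrow\;
  |f(x) - f(x_0)| = |2x - 2x_0| = 2\,|x - x_0| < 2\alpha = \varepsilon .
\]
```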

What this tells us is that in order to keep the consequences within an arbitrarily tight range of stability (say, one we feel comfortable with), we need to be able to bound the size of the changes accordingly.
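As a toy illustration (nothing more than a sketch: the helper find_alpha, its candidate bounds, and its sampling grid are made up for the occasion), one can brute-force a suitable α for a given ε by sampling around x0:

```python
import math

def find_alpha(f, x0, epsilon, candidates=(1.0, 0.1, 0.01, 0.001, 0.0001), samples=1000):
    """Toy numerical search (illustrative only, not a proof): return the first
    candidate alpha such that every sampled x with |x - x0| < alpha keeps
    |f(x) - f(x0)| < epsilon, or None if no tested bound is small enough."""
    for alpha in candidates:  # progressively tighter bounds on the change
        ok = True
        for i in range(1, samples):
            # sample strictly inside (x0 - alpha, x0 + alpha)
            x = x0 - alpha + (2 * alpha) * i / samples
            if abs(f(x) - f(x0)) >= epsilon:
                ok = False
                break
        if ok:
            return alpha
    return None

# The gentler the function, the larger the change we can afford for the same epsilon:
print(find_alpha(math.sin, x0=0.0, epsilon=0.05))           # e.g. 0.01
print(find_alpha(lambda x: 100 * x, x0=0.0, epsilon=0.05))  # e.g. 0.0001
```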

This story is told by Lakoff and Núñez in their 2000 book Where Mathematics Comes From, and in particular in Part IV, Banning Space and Motion: The Discretization Program That Shaped Modern Mathematics.

Physics had entered modernity with Galileo two and a half centuries earlier, and would go postmodern in 1905. Maths would have to wait until 1931.