Philosophy · Logic | 18 March 2026


Once upon a time, there was a mountain; in the mountain there was a temple; in the temple there was a tall tree, and on it hung many, many people…

In the early stages of studying mathematical analysis, we learn the concepts of differentiation and integration and come away with a vague feeling: differentiation and integration are inverse operations of each other. Let’s re-examine this idea through a one-dimensional lens.


1. What is Differentiation?

If a function $f$ is differentiable at a point $x$, we can approximate it linearly near $x$. Geometrically, we are looking at the slope of the tangent line:

\[f(y) = f(x) + a(y - x) + o(y - x)\]

The term $o(y-x)$ is the “magic” part—an error that vanishes faster than the linear term as $y$ approaches $x$. We call $a$ the derivative of $f$ at $x$, denoted $f'(x)$.
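To make the $o(y-x)$ term concrete, here is a small numeric sketch (the choice $f(x) = x^2$ at $x = 1$, with $a = f'(1) = 2$, is an illustrative assumption, not from the text): the remainder divided by $h$ should itself shrink toward zero.

```python
# For f(x) = x^2 at x = 1, the remainder f(x+h) - f(x) - a*h with a = 2
# is o(h): dividing it by h still gives something that tends to 0.
def f(x):
    return x * x

x, a = 1.0, 2.0  # a = f'(1) for this example
for h in [0.1, 0.01, 0.001]:
    remainder = f(x + h) - f(x) - a * h  # the o(h) error term
    print(h, remainder / h)              # the ratio shrinks with h
```

Here the ratio equals $h$ exactly (since the remainder is $h^2$), which is the defining behaviour of a little-o term.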

2. What is Integration?

Think of (Riemann) integration like eating a hamburger: one bite at a time, you finish the whole thing. We divide the area into tiny rectangular “bites,” picking a sample point $\xi_i$ in each subinterval $[x_i, x_{i+1}]$:

\[\lim_{\max |x_{i+1} - x_i| \to 0} \sum_{i=0}^{n-1} a(\xi_i)(x_{i+1} - x_i) = \int_0^1 a(x) \, dx\]

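The Riemann sum above can be sketched numerically. As a hedged example (the integrand $a(x) = x^2$ on $[0,1]$, whose exact area is $1/3$, is my own choice), a uniform left-endpoint sum approaches the true value as the number of “bites” $n$ grows:

```python
# Left-endpoint Riemann sum on a uniform partition of [0, 1].
# Increasing n (taking more, smaller bites) drives the sum toward the integral.
def riemann_sum(a, n):
    dx = 1.0 / n
    return sum(a(i * dx) * dx for i in range(n))

for n in [10, 100, 1000]:
    print(n, riemann_sum(lambda t: t * t, n))  # approaches 1/3
```

For this integrand the error shrinks like $1/(2n)$, so each tenfold increase in $n$ buys roughly one more correct digit.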


3. Are They Really Inverses?

Case A: Integrate, then Differentiate

If we define $f(x) = \int_0^x a(t) \, dt$, then $f'(x)$ is just the growth rate of that area. Provided $a$ is continuous at $x$, the area grows at exactly the rate $a(x)$ there, so differentiating gives our original function back. Success!
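Case A can be checked numerically. In this sketch (the continuous integrand $a(t) = \cos t$ and the point $x = 0.7$ are illustrative assumptions), we build $f$ as a Riemann sum and then take a central difference quotient; the result should land close to $a(x)$:

```python
# Integrate, then differentiate: the numerical derivative of
# f(x) = ∫_0^x a(t) dt should recover a(x) for continuous a.
import math

def a(t):
    return math.cos(t)

def f(x, n=10_000):  # f(x) ≈ ∫_0^x a(t) dt via a left Riemann sum
    dx = x / n
    return sum(a(i * dx) * dx for i in range(n))

x, h = 0.7, 1e-4
derivative = (f(x + h) - f(x - h)) / (2 * h)  # central difference quotient
print(derivative, a(x))  # the two values nearly agree
```

The agreement is only approximate (quadrature and difference-quotient errors both enter), but it illustrates the mechanism: the area’s growth rate at $x$ is the integrand’s value there.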

Case B: Differentiate, then Integrate

Using the Lagrange Mean Value Theorem, we can pick $\xi_i \in (x_i, x_{i+1})$ with $f(x_{i+1}) - f(x_i) = f'(\xi_i)(x_{i+1} - x_i)$, so that

\[f(x) - f(0) = \sum_{i=0}^{n-1} f'(\xi_i)(x_{i+1} - x_i) \to \int_0^x f'(t) \, dt\]

This is the famous Newton–Leibniz formula. We just re-proved it. Ha!
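The Newton–Leibniz formula is easy to sanity-check numerically. In this sketch (the sample function $f(t) = \sin t$, with $f'(t) = \cos t$, and the endpoint $x = 1.2$ are my own choices), a Riemann sum of $f'$ over $[0, x]$ should reproduce $f(x) - f(0)$:

```python
# Differentiate, then integrate: ∫_0^x f'(t) dt should equal f(x) - f(0).
# Here f(t) = sin(t), so f'(t) = cos(t) and the right-hand side is sin(x).
import math

x, n = 1.2, 100_000
dx = x / n
integral = sum(math.cos(i * dx) * dx for i in range(n))
print(integral, math.sin(x) - math.sin(0.0))  # the two sides nearly agree
```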


4. Wait… Is there a trap? ๑´ڡ`๑

Is it always this perfect? Consider a “step” function $a(x)$.

\[a(x) = \begin{cases} 1 & 0 \le x < 1 \\ 0 & \text{otherwise} \end{cases}\]

When we integrate this to get $f(x)$, we find that $f$ has “sharp corners” at $x=0$ and $x=1$. At those corners, $f$ is not differentiable.

> The Mathematician’s Dilemma: if we have a problem at one point, do we ignore it? What about two points? Or a countable number of points? If we change a function at too many points, it might stop being Riemann integrable entirely!
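The corner is easy to see in code. For this step function, $f(x) = \int_0^x a(t)\,dt$ equals $x$ on $[0,1]$ and stays at $1$ afterwards (written below as a direct formula, not a Riemann sum); at $x = 1$ the one-sided difference quotients disagree, so no derivative exists there:

```python
# f(x) = ∫_0^x a(t) dt for the step function a: f(x) = x on [0, 1],
# then f stays flat at 1. At x = 1 the left and right slopes differ.
def f(x):
    return max(0.0, min(x, 1.0))

h = 1e-6
left_slope = (f(1.0) - f(1.0 - h)) / h    # slope from the left:  1
right_slope = (f(1.0 + h) - f(1.0)) / h   # slope from the right: 0
print(left_slope, right_slope)
```

A mismatch of one-sided slopes is exactly what a “sharp corner” means: the linear approximation of Section 1 cannot exist with a single value of $a$.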

Want to see how mathematicians fixed this? This "trap" is exactly why Lebesgue integration was invented. But that's a story for another mountain, another temple, and another hamburger.

subkiy

subkiy is Professor of Philosophy at the University of Edinburgh, where she specialises in the history of logic and the philosophy of mathematics. Her most recent book is The Formal Turn (Princeton, 2024).
