Fundamental theorem of calculus generalized

The first fundamental theorem of calculus says that integration undoes differentiation.

The second fundamental theorem of calculus says that differentiation undoes integration.

This post looks at the fine print of these two theorems, in their basic forms and when generalized to Lebesgue integration.

Second fundamental theorem of calculus

We’ll start with the second fundamental theorem because it’s simpler.

In its basic form, it says that if f is a continuous function on an open interval I, and a is a point in I, then the function F defined by

F(x) = \int_a^x f(t) \, dt

is an antiderivative for f on the interval I, i.e.

F'(x) = f(x)

for all x in I.

In that sense differentiation undoes integration.
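
Here is a quick numerical sanity check of the statement above, a minimal sketch of my own (not from the original post), using NumPy and SciPy: approximate F by numerical integration and compare its difference quotient with f.

    import numpy as np
    from scipy.integrate import quad

    f = np.cos                    # a continuous function on an interval around 0

    def F(x):
        # F(x) = integral of f from a = 0 to x, computed numerically
        return quad(f, 0.0, x)[0]

    x, h = 1.2, 1e-5
    approx_derivative = (F(x + h) - F(x - h)) / (2 * h)   # central difference for F'(x)
    print(approx_derivative, f(x))                        # the two values agree closely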

If we remove the requirement that f be continuous, we still have F′ = f almost everywhere as long as f is absolutely integrable, i.e. the integral of |f| over I is finite.
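
To see this with a function that is integrable but not continuous, here is a small sketch (my own example): a step function f with a jump at 0. Its integral F has a corner at 0, so F′ fails to exist there, but F′(x) = f(x) at every other point.

    def f(x):
        return -1.0 if x < 0 else 2.0      # absolutely integrable on (-1, 1), jump at 0

    def F(x):
        # F(x) = integral of f from a = -1 to x, worked out by hand
        return -(x + 1.0) if x < 0 else -1.0 + 2.0 * x

    h = 1e-6
    for x in [-0.5, 0.3, 0.7]:             # points away from the jump
        print((F(x + h) - F(x - h)) / (2 * h), f(x))   # difference quotient matches f(x)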

In more detail, F′(x) = f(x) at every Lebesgue point x of f, i.e. every point x that satisfies

\lim_{h \to 0^+} \frac{1}{2h} \int_{x-h}^{x+h} |f(t) - f(x)| \, dt = 0,

and almost every point is a Lebesgue point.

First fundamental theorem of calculus

The first fundamental theorem of calculus says that if the derivative of F is f and f is continuous on an interval [a, b], then

\int_a^b f(x) \, dx = F(b) - F(a).

So if F has a continuous derivative, then integration undoes differentiation.
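
A minimal numerical check of this statement (again my own sketch, assuming SciPy’s quad for the integration): take F(x) = sin(x), whose derivative f(x) = cos(x) is continuous, and compare the integral of f over [a, b] with F(b) − F(a).

    import numpy as np
    from scipy.integrate import quad

    F, f = np.sin, np.cos          # F' = f, and f is continuous on [a, b]

    a, b = 0.0, 2.0
    lhs = quad(f, a, b)[0]         # integral of F' over [a, b]
    rhs = F(b) - F(a)
    print(lhs, rhs)                # both equal sin(2) - sin(0) ≈ 0.9093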

What if F is continuous but only differentiable at almost every point rather than at every point? Then the theorem doesn’t necessarily hold.

But the theorem does hold if we require F to be absolutely continuous rather than just continuous.

An absolutely continuous function maps sets of measure zero to sets of measure zero.

It’s not easy to imagine continuous functions that are not absolutely continuous, but Cantor’s function, a.k.a. the Devil’s staircase, takes the Cantor set, a set of measure zero, to a set of measure one, and so it cannot be absolutely continuous.
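
The Cantor function is also the standard example of how the first fundamental theorem fails without absolute continuity: it is locally constant off the Cantor set, so its derivative is 0 almost everywhere, and the integral of that derivative over [0, 1] is 0 even though the function rises from 0 to 1. The code below is my own approximation of the Cantor function via the base-3 expansion, not something from the original post.

    def cantor(x, depth=40):
        """Approximate value of the Cantor function at x in [0, 1]."""
        if x <= 0.0:
            return 0.0
        if x >= 1.0:
            return 1.0
        result, scale = 0.0, 0.5
        for _ in range(depth):
            x *= 3
            digit = int(x)
            x -= digit
            if digit == 1:                 # x lies in a removed middle-third interval
                return result + scale      # the function is constant there
            if digit == 2:
                result += scale
            scale /= 2
        return result

    print(cantor(0.0), cantor(1.0))        # 0.0 and 1.0: the total rise is 1
    print(cantor(0.4), cantor(0.6))        # both 0.5: constant across the removed middle third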

The usual definition of absolute continuity takes the ordinary definition of continuity and chops the interval into n pieces.

That is, f is absolutely continuous if for every ε > 0 there exists a δ > 0 such that for any finite collection of disjoint intervals of total length less than δ, the sum of the variation of f over all the intervals is less than ε. Crucially, the same δ must work no matter how many intervals there are.

If n = 1, i.e. if we only consider a single interval, this is the definition of uniform continuity, so absolute continuity is a more demanding criterion than uniform continuity.
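
Here is a sketch, my own illustration, of why the same δ has to work for every number of intervals. After k rounds of removing middle thirds, the Cantor set is covered by 2^k intervals of total length (2/3)^k, which shrinks to zero, yet the Cantor function rises by 2^(-k) across each of them, so its total variation over the collection stays equal to 1. A Lipschitz function such as x² has its variation over the same collection bounded by a multiple of the total length, so it does shrink to zero.

    def stage_intervals(k):
        """Intervals left after k rounds of removing open middle thirds of [0, 1]."""
        intervals = [(0.0, 1.0)]
        for _ in range(k):
            refined = []
            for a, b in intervals:
                third = (b - a) / 3
                refined.append((a, a + third))
                refined.append((b - third, b))
            intervals = refined
        return intervals

    for k in [2, 5, 10]:
        ivs = stage_intervals(k)
        total_length = sum(b - a for a, b in ivs)                 # (2/3)**k, goes to 0
        variation_square = sum(abs(b**2 - a**2) for a, b in ivs)  # f(x) = x**2: goes to 0
        variation_cantor = len(ivs) * 0.5**k                      # rise of 2**(-k) per interval: always 1
        print(k, total_length, variation_square, variation_cantor)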
