Minimizing worst case error

It’s very satisfying to know that you have a good solution even under the worst circumstances.

Worst-case thinking doesn’t have to be concerned with probabilities, with what is likely to happen, only with what could happen.

But whenever you speak of what could happen, you have to limit your universe of possibilities.

Suppose you ask me to write a program that will compute the sine of a number.

I come up with a Chebyshev approximation for the sine function over the interval [0, 2π] so that the maximum approximation error at any point in that interval is less than 10⁻⁸.
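For illustration, here's a minimal sketch of what such an approximation might look like, using NumPy's Chebyshev class; the degree (17) and the size of the test grid are arbitrary choices of mine, not part of the argument.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Interpolate sine at the Chebyshev points of degree 17 over [0, 2*pi].
# Degree 17 is a guess; raise it if the measured error comes out too large.
p = Chebyshev.interpolate(np.sin, 17, domain=[0, 2 * np.pi])

# Estimate the worst-case error over the interval on a dense grid.
x = np.linspace(0, 2 * np.pi, 100_000)
print(np.max(np.abs(p(x) - np.sin(x))))  # should be well below 1e-8
```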

So I proudly announce that in the worst case the error is less than 10⁻⁸.

Then you come back and ask “What if I enter a number larger than 2π?” Since my approximation is a polynomial, you can make it take on as large a value as you please by sticking in a large enough argument.

But sine takes on values between -1 and 1.

So the worst case error is unlimited.
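To make that concrete, here's the same sketch evaluated outside the interval it was fitted on (the test points are arbitrary):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

p = Chebyshev.interpolate(np.sin, 17, domain=[0, 2 * np.pi])

# Outside the fitting interval the polynomial has nothing to do with sine,
# and for large enough arguments it blows up entirely, while sine stays in [-1, 1].
for x in (10.0, 100.0, 1000.0):
    print(x, p(x), np.sin(x))
```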

So I go back and do some range reduction and I tell you that my program will be accurate to within 10⁻⁸.
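A minimal sketch of that fix, continuing the example above with a hypothetical approx_sin helper (the naive reduction below is itself inaccurate for enormous arguments; see the related post at the end):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

p = Chebyshev.interpolate(np.sin, 17, domain=[0, 2 * np.pi])

def approx_sin(x):
    """Approximate sine for any finite float via naive range reduction.

    x % (2*pi) lands in [0, 2*pi), where the polynomial is accurate.
    This simple reduction itself loses accuracy when x is enormous.
    """
    return p(x % (2 * np.pi))

print(approx_sin(1000.0), np.sin(1000.0))  # the two should agree closely
```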

And because I got burned last time by not specifying limits on the input, I make explicit my assumption that the input is a valid, finite IEEE 754 64-bit floating point number.

Take that! So then you come back and say “What if I make a mistake entering my number?” I object that this isn’t fair, but you say “Look, I want to minimize the worst case scenario, not just the worst scenario that fits into your frame of thinking.”

So then I rewrite my program to always return 0.

That result can be off by as much as 1, but never more than that.

This is an artificial example, but it illustrates a practical point: worst-case solutions minimize the worst outcome relative to some set of scenarios under consideration.

With more imagination you can always come up with a bigger set.

Maybe the initial set left out some important and feasible scenarios.

Or maybe it was big enough and only left out far-fetched scenarios.

It may be quite reasonable to exclude far-fetched scenarios, but when you do, you’re implicitly appealing to probability, because far-fetched means low probability.

Related post: Sine of a googol.
