Sunday Function

There's some math here, I'd rate it at Calc 2 difficulty. If you don't know calculus, that's fine! The details will be obscure but I think you'll still appreciate the abstract beauty of the method.

Ok, pick a rapidly oscillating function. It doesn't really matter which, so as an example I'll make one up. It has no particular physical significance, but the method we're going to test out on it ends up being very useful in numerous physical problems. A lot of things oscillate, and many times we're after the overall average effect of those vibrations, not the details of the vibration itself. So, our test subject:

[Equation: a smooth prefactor (an exponential and a polynomial) times a rapidly oscillating sinusoid]

I've put the halves in parentheses to emphasize that there are two parts of interest: a regular old function (in this case an exponential and a polynomial), and a fast-changing oscillating sinusoid. It looks like this:

[Figure: graph of the test function, fast oscillations inside a smooth envelope]

Now, say this is the graph of a velocity. If we integrate that velocity, we get the total displacement. Integration is just the process of calculating the area between the curve of the graph and the x-axis: take the area to be positive where the graph is above the axis, and negative where it's below.

For this oscillating graph, notice that each upswing is very nearly matched by a downswing. These will, to a very good approximation, cancel each other out, and we expect the total integral to be close to zero.
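The cancellation is easy to check numerically. The original equation image didn't survive, so as a purely hypothetical stand-in with the same shape I'll use the prefactor u(x) = x·e^(-x) times sin(30x) on [0, 1]; the idea, not the specific numbers, is the point. A minimal sketch using Simpson's rule:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

# Hypothetical stand-in: a smooth prefactor times a fast sinusoid.
u = lambda x: x * math.exp(-x)         # the "regular old function"
f = lambda x: u(x) * math.sin(30 * x)  # the full oscillating integrand

wiggly = simpson(f, 0.0, 1.0)    # upswings and downswings nearly cancel
envelope = simpson(u, 0.0, 1.0)  # area under the smooth prefactor alone

print(f"oscillating integral: {wiggly:.6f}")    # close to zero
print(f"envelope integral:    {envelope:.6f}")  # two orders of magnitude bigger
```

For this stand-in, the oscillating integral comes out around a thousandth of the area under the smooth envelope, which is the cancellation in action.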

But it's a bit of a pain to actually compute the antiderivative to do the integral exactly, and most numerical methods break down if the oscillation is too fast. We need another way. That way is integration by parts - with some physics-style finagling. Integration by parts takes advantage of the product rule for derivatives, and works like this:

$$\int_a^b u\,dv \;=\; \Big[\,u\,v\,\Big]_a^b \;-\; \int_a^b v\,du$$

Call the oscillating part "dv" and the other regular part "u", and evaluate:

[Equation: the surface term uv evaluated at the endpoints, minus the remaining integral of v du, each carrying a factor of 1/30 from integrating the sinusoid]

It's easier than it looks, I promise. Now here's the trick. The integration by parts has two terms. The first can be evaluated straight away, the second requires another integration. However, notice that in evaluating the second integral we'd integrate by parts again, bringing in another factor of 1/30. Thus we might expect that we can leave off that integral entirely and just use the first "surface term" as an approximation. Now as it happens we could have done this integral exactly, so let's see what happens.
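Here's the trick in code, again with the hypothetical stand-in u(x) = x·e^(-x) on [0, 1] (the original integrand was lost, so these numbers won't match the ones quoted for it). With dv = sin(30x) dx we have v = -cos(30x)/30, so the surface term [uv] needs nothing but endpoint values of u:

```python
import math

u = lambda x: x * math.exp(-x)  # hypothetical stand-in for the smooth part
w = 30.0                        # oscillation frequency

def surface_term(a, b):
    """The boundary term [u v] from a to b, with v = -cos(w x)/w."""
    return (-u(b) * math.cos(w * b) / w) - (-u(a) * math.cos(w * a) / w)

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

approx = surface_term(0.0, 1.0)  # just two endpoint evaluations
exact = simpson(lambda x: u(x) * math.sin(w * x), 0.0, 1.0)

print(f"surface-term approximation: {approx:.6f}")
print(f"numerical integral:         {exact:.6f}")
```

The two agree to a few percent, and all the approximation cost was evaluating u at the two endpoints.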

Our approximate result: -0.00245746
The actual answer: -0.00252991

Not bad. If we wanted, we could iterate again on that second integral and get a much closer approximation.

There are two important things to gain here. First, as the oscillations grow more and more rapid, the approximation actually works better and better, due to the increasing smallness of the fraction in front of the oscillating term. This is despite the fact that raw computational methods tend to break down for rapid oscillations. Second, we don't need to antidifferentiate the non-oscillating part; we only have to evaluate it at the endpoints. That's a major advantage when the non-oscillating part has no clean expression for its antiderivative.
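The first point can be checked directly: keep the same hypothetical stand-in u(x) = x·e^(-x) and sweep the frequency upward. The surface-term error collapses as the oscillation speeds up, precisely the regime where brute-force quadrature gets expensive:

```python
import math

u = lambda x: x * math.exp(-x)  # hypothetical stand-in for the smooth part

def simpson(f, a, b, n=20000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

errors = []
for w in (30.0, 100.0, 300.0):
    exact = simpson(lambda x, w=w: u(x) * math.sin(w * x), 0.0, 1.0)
    surface = -u(1.0) * math.cos(w) / w + u(0.0) / w  # [uv] at the endpoints
    errors.append(abs(surface - exact))
    print(f"frequency {w:5.0f}: surface-term error {errors[-1]:.2e}")
```

For this stand-in the error drops by orders of magnitude over the sweep: the faster the wiggle, the better the two-endpoint approximation.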

Aside from the practical utility in physics, this line of thought pretty quickly runs into some important areas of pure mathematics. We're splashing around near the shore, but not much farther out the continental shelf drops off and we're in the deep waters of the theory of integral equations.

Kind of cool, I think.


...the increasing smallness...

You physicists have a word for everything, don't you?

By Flathead Phillips (not verified) on 25 Jan 2009 #permalink

A neat trick! I don't think I've seen that before. Of course, in all the math classes I do, the emphasis is never never never on calculation. Which makes me a little sad. Some calculation techniques seem fascinating.

Of course, the above seems to work optimally when the oscillating part is anti-differentiable. But what would you suggest for something like Sin(1/x)?

Have you thought about using something related to the Fourier Transform (or FFT) for a Sunday function?

I've recently started to get my feet wet in DSP, of which I haven't seen much since college, and I was surprised at how much I had forgotten. You've got a definite knack for explanation that might save others the headache that I went through. I know that I'll be looking through past functions to see if there's anything simple and interesting enough to play around with, as your examples are clearer than the ones in my old reference books so far.