Sunday Function

On the surface it's one of the most boring possible functions. Two straight horizontal lines: the Heaviside step function.

[Figure: plot of the Heaviside step function θ(x), equal to 0 for x < 0 and jumping to 1 at x = 0]

Usually denoted θ(x), it's equal to 0 if x is less than zero and equal to 1 if x is greater than zero. At x = 0 exactly it doesn't really matter what the value is for physics purposes, so it's generally taken to be 1. Mathematicians tend to define it as 0.5 at x = 0, but they're kind of weird.
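Written out piecewise, with the physics convention for the value at zero:

$$\theta(x) = \begin{cases} 0, & x < 0 \\ 1, & x \geq 0 \end{cases}$$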

Believe it or not, there are some pretty good reasons for having the Heaviside function around. Walk into a dark room and flip on the lights. Before you hit the switch, the electrical current is a constant 0. After, it's a 60 Hz alternating sine wave. You can express the whole thing by saying that the current is described by that sine wave multiplied by the step function. And since the step function is in fact a nice function with mathematically well-defined properties, you can do calculus on the moment of transition.
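As a sketch (the amplitude I₀ and the choice of zero phase here are placeholders, not anything from the original post), flipping the switch at t = 0 lets you write the current as a single expression valid for all time:

$$I(t) = I_0\,\theta(t)\,\sin(2\pi \cdot 60\,\mathrm{Hz} \cdot t)$$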

Take for instance the harmonic oscillator - say, a weight oscillating on a spring. At time t = 0 you turn on a frictional force. With a little bit of math you can write the differential equation in terms of the step function and find an exact mathematical expression for the behavior of the system. It's going to look kind of like this:

[Figure: the oscillator's motion over time, oscillating freely before t = 0 and damping away after the friction switches on]

There's one kind of behavior before t = 0, and a different kind after. The step function formalism is a little superfluous in this slightly contrived case from a problem-solving standpoint, but it's a good example of the type of situation where you have to deal analytically with an instant change in the behavior of a system.
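A minimal way to set that up, assuming a generic mass m, spring constant k, and damping coefficient b (placeholders, not values from the post), is to let θ(t) switch the friction term on:

$$m\ddot{x} + b\,\theta(t)\,\dot{x} + kx = 0$$

For t < 0 the damping term vanishes and the motion is pure oscillation; for t > 0 it's an ordinary damped oscillator, which is exactly the two regimes sketched in the figure.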

Like all math, even the easy-seeming stuff usually has deep waters nearby. The Heaviside step function is just a stone's throw away from the Dirac delta function, which (so to speak) represents whacking something with a hammer rather than flipping on a switch. We'll save that for later.


I was always just amused by the way the function's name so perfectly describes it despite being just the name of the guy who invented it.

A bit like the Poynting Vector in that sense.

It's actually spelled "Heaviside".

Yes it is. Thanks, and sorry for the typo!

By Matt Springer on 15 Feb 2009

He was an odd character, Oliver Heaviside, but also a great scientist.

Back in the last crash (around 1999), one of the engineers I worked with, a Dr. Len Bieman, made an observation that I thought was wise and funny. But you had to be an engineer, or have read the above explanation, to get it:

"Economies don't like step functions."

Not to mention the step function's use in Laplace transforms. Very important for us engineers.
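For the record, the standard result behind that remark: a step that switches on at t = a ≥ 0 transforms to

$$\mathcal{L}\{\theta(t-a)\}(s) = \int_0^\infty e^{-st}\,\theta(t-a)\,dt = \frac{e^{-as}}{s}, \qquad \mathrm{Re}(s) > 0,$$

so delayed switch-ons show up in the transformed equations as simple exponential factors.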

Actually, I've found that one of the more enlightening uses of these functions is in the analysis of beam bending. Beam loads can be idealized as point loads, uniform loads, triangle functions (e.g. hydrostatic pressure!), etc. -- all of which are conveniently represented by singularity functions. In mathematical physics, for some reason we usually give these functions different names (delta(x-a), theta(x-a), etc.), but in the analysis of beams all are treated as members of a family ⟨x-a⟩^k, where the brackets ⟨ ⟩ in this case mean that the function is zero when the argument is negative. For instance, when we integrate from -infinity, successive integrals of ⟨x-a⟩^(-1) (i.e. delta(x-a)) are simply ⟨x-a⟩^0, ⟨x-a⟩^1, (1/2)⟨x-a⟩^2.... In beam bending one has to do four successive integrals (load to shear to moment to slope to displacement), so this notation saves a lot of "casing".

BBB
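For readers unfamiliar with the bracket notation in that comment, the standard singularity-function convention it describes is

$$\langle x-a\rangle^{n} = \begin{cases} 0, & x < a \\ (x-a)^{n}, & x \geq a \end{cases} \qquad (n \geq 0), \qquad \int_{-\infty}^{x} \langle u-a\rangle^{n}\,du = \frac{\langle x-a\rangle^{n+1}}{n+1},$$

so each of the four integrations from load to displacement just raises the exponent by one.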


In an older "Sunday Function" (exponential), I mentioned the books of Paul J. Nahin. His biography of Oliver Heaviside has been recently republished by Johns Hopkins Press in paperback. Worth a look, and if you have the time, a read.

By David Derbes on 01 Mar 2009