Puttin' on the Ritz Variation

Continuing from yesterday's post on approximation methods in quantum mechanics, here's another common method worth a close look. It's one of my favorites, because it's a rare technique in which you can just make something completely up from thin air and it will very probably work well nonetheless.

Let's say you have a particle floating around in some potential. Maybe it's a perturbed square well, an anharmonic oscillator, or just about any weird potential you can think of. You have no idea what the actual wavefunctions are for the potential. After all, even simple solvable potentials often have fairly horrible wavefunctions - the basic vanilla harmonic oscillator involves the Hermite polynomials inside a Gaussian envelope.

But with this method of Ritz variation (sometimes called Rayleigh-Ritz variation or similar) there's magic that can be done. You can make a wavefunction up out of thin air and use it to approximate the ground state of the potential. Here's how:

Call your made-up wavefunction by the Greek letter psi. Find the expectation value of the Hamiltonian (which is the average energy of that state) in the usual way, with H the Hamiltonian of whatever potential you happen to be dealing with, and call that energy E:

$$E = \langle \psi | H | \psi \rangle$$
Let's examine that more closely. Expand the psi in terms of the unknown actual eigenstates of the potential, which we'll call n:

$$|\psi\rangle = \sum_n c_n |n\rangle, \qquad E = \sum_{m,n} c_m^* c_n \langle m | H | n \rangle$$
Since the n are orthonormal eigenstates of H, the operator H acting on n will just give the energy of that state, and the cross terms between different eigenstates vanish. We'll do that, and move the energy to the front of the sum to keep it neat:

$$E = \sum_n E_n |c_n|^2$$
Now of course the various En are the actual energy values of the actual wavefunctions, which we don't know. They're arranged in ascending order, so each successive energy level is at least as large as the one before. So if we just replace all the En with E0 (the ground state energy), the sum can only get smaller or stay the same. That turns the equals sign into an inequality:

$$E = \sum_n E_n |c_n|^2 \ge E_0 \sum_n |c_n|^2$$
You are perfectly welcome to drop the expansion in terms of n: since psi is normalized, the sum of the |cn|2 is just 1, and in view of the first equation we have:

$$\langle \psi | H | \psi \rangle \ge E_0$$
That's our result, and a fantastically powerful one at that. If you're looking for the ground state energy of a potential, you can make up any old wavefunction you like, whether it's an eigenfunction or not. Find the expectation value of the Hamiltonian for that made-up function and you're absolutely guaranteed that it will be at least as big as the actual ground state energy. So you can use any method you can beg, borrow, or steal to adjust your made-up function toward a smaller expectation value of the Hamiltonian, and you're guaranteed to close in on the real value of the ground state energy from above. You'll never get all the way there unless you happen to stumble across the actual wavefunction, but you will usually get shockingly close.
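To make this concrete, here's a quick numerical sketch (my own example, not from the post above): take the quartic oscillator H = -(1/2) d²/dx² + x⁴ in units ℏ = m = 1, guess a Gaussian trial function exp(-a x²), and scan the width parameter a for the lowest expectation value. The true ground state energy of this potential is about 0.668, so the bound should land just above it.

```python
import numpy as np

# Ritz variation for the quartic oscillator H = -(1/2) d^2/dx^2 + x^4
# (units hbar = m = 1). The quartic potential and the Gaussian trial
# function exp(-a x^2) are my illustrative choices, not from the post.

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
V = x**4

def energy(a):
    """Expectation value <psi|H|psi> for the normalized trial exp(-a x^2)."""
    psi = np.exp(-a * x**2)
    psi /= np.sqrt(np.sum(psi**2) * dx)            # normalize on the grid
    d2psi = np.gradient(np.gradient(psi, dx), dx)  # numerical second derivative
    H_psi = -0.5 * d2psi + V * psi
    return np.sum(psi * H_psi) * dx

# Scan the variational parameter and keep the lowest upper bound found.
a_vals = np.linspace(0.3, 2.0, 400)
best = min(energy(a) for a in a_vals)
print(f"best variational bound: {best:.4f}")  # ~0.681, just above the true ~0.668
```

Even with a trial function picked purely for convenience, the bound comes in within a couple percent of the exact answer, which is the "shockingly close" part in action.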

I owe a commentator a discussion on polarization as a follow-up to the light wave post, but as soon as that's out of the way (and assuming I don't waste Saturday on non-physics ephemera, which is pretty likely) I'll do an example of Ritz variation.

Quick notes: first, new readers, I promise we're not usually so technical around here. Stick around, I promise less eye-glazing material soon! Second, ScienceBlogs as a whole has been experiencing some problems as a result of Movable Type scaling issues during some upgrades and changes. Comments may occasionally bork themselves, but it should be rare and it should be fixed in a few days.

UPDATE: I left off a bunch of closed subscript tags, breaking the site in IE earlier. Should be fixed now. Thanks for letting me know!


E< sub>n< sub> should be E< sub>n< /sub> (without the spaces).

And it's not just applicable to quantum mechanics. Any Sturm-Liouville type boundary value problem is amenable to approximation methods just like you've described. In fact, your method above using a single trial wavefunction to approximate a single energy (the ground state energy) can be generalized by using some collection of n trial functions to get approximations to the first n stationary states and their corresponding energies. I used a similar method to find the eigenfunctions and eigenfrequencies of a vibrating plate, which are impossible to derive analytically except in a few, special cases of boundary conditions.
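The generalization the commenter describes can be sketched numerically (my own illustration, not part of the comment): expand the trial wavefunction over a basis of n functions f_i, and making the energy stationary leads to the generalized eigenvalue problem H c = E S c, whose lowest eigenvalues approximate the lowest energy levels from above. The quartic oscillator and the polynomial-times-Gaussian basis here are my choices for the example.

```python
import numpy as np
from scipy.linalg import eigh

# Generalized Rayleigh-Ritz: trial psi = sum_i c_i f_i over a small basis.
# Stationarity of <psi|H|psi> subject to normalization gives H c = E S c,
# with H the Hamiltonian matrix and S the overlap matrix in that basis.
# Potential and basis are illustrative assumptions, not from the comment.

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
V = x**4
basis = [x**k * np.exp(-x**2 / 2) for k in range(6)]

n = len(basis)
H = np.zeros((n, n))
S = np.zeros((n, n))
for j, fj in enumerate(basis):
    d2fj = np.gradient(np.gradient(fj, dx), dx)
    Hfj = -0.5 * d2fj + V * fj
    for i, fi in enumerate(basis):
        H[i, j] = np.sum(fi * Hfj) * dx  # matrix element <f_i|H|f_j>
        S[i, j] = np.sum(fi * fj) * dx   # overlap <f_i|f_j>

H = 0.5 * (H + H.T)                      # symmetrize away numerical noise
E = eigh(H, S, eigvals_only=True)        # generalized eigenproblem H c = E S c
print(E[:3])                             # variational estimates, lowest first
```

Each eigenvalue is an upper bound on the corresponding exact level, and the bounds tighten as the basis grows; the same machinery works for any Sturm-Liouville problem, like the commenter's vibrating plate.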

Math, you gotta love it.

can anyone read the comments? i don't have a microscope, so the font is a wee bit small...

The cool thing is, if you treat < psi | H | psi > as a functional and optimize it (subject to the constraint that < psi|psi > = 1), you get the Schroedinger equation back again.

dammit, the < feature bites me in the ass again!
Read as:

The cool thing is, if you treat < psi | H | psi > as a functional and optimize it (subject to the constraint that < psi | psi > = 1), you get the Schroedinger equation back again.

OK, I give up. Don't use Bra-Ket notation in the comments, even if the preview says it's ok.
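For the record, the commenter's claim can be spelled out with a Lagrange multiplier (a standard textbook sketch, not part of the original thread): make the expectation value stationary subject to normalization by varying psi freely,

```latex
\delta \left[ \langle \psi | H | \psi \rangle
            - E \left( \langle \psi | \psi \rangle - 1 \right) \right] = 0
\quad \Longrightarrow \quad
H \lvert \psi \rangle = E \lvert \psi \rangle
```

The multiplier E that enforces normalization turns out to be the energy eigenvalue, so the stationary points of the Ritz functional are exactly the solutions of the time-independent Schroedinger equation.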