Bose Condensation of Coffee?

Writing up the evaporative cooling post on cold atom techniques, I used the standard analogy that people in the field use for describing the process: cooling an atomic vapor to BEC is like the cooling of a cup of coffee, where the hottest component particles manage to escape the system of interest, and what's left behind is colder. The departing atoms or coffee molecules carry off more than the average energy per particle, leaving a lower average energy (and thus a lower temperature) for the remainder.
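
Just to make that concrete, here's a little numerical toy-- my own back-of-the-envelope sketch, not anything from the actual experiments: sample a bunch of particle energies from a thermal distribution, chop off everything above some escape threshold, and look at the average energy of what's left.

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0
# In 3D, thermal kinetic energies follow a gamma distribution with shape 3/2
# (three quadratic degrees of freedom), so the mean energy is (3/2) kT.
energies = rng.gamma(shape=1.5, scale=kT, size=100_000)

cut = 3.0 * kT                     # energy threshold for "escaping"
kept = energies[energies < cut]    # everything above the cut leaves

print(f"mean energy before: {energies.mean():.3f}")   # about 1.5 kT
print(f"mean energy after:  {kept.mean():.3f}")       # noticeably lower
print(f"fraction lost:      {1 - kept.size / energies.size:.3f}")  # ~0.11
```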

A question that sometimes comes up when I talk about this is how you can possibly use this to get a substantial reduction in temperature, given that coffee doesn't really cool down by all that much-- you don't see people's drinks Bose condensing at Starbucks when they fall down the YouTube rabbit hole and leave them sitting untouched for too long, after all. The answer (at least part of it) is that when AMO physicists do evaporative cooling, they can continue to lower the energy threshold at which "hot" atoms are removed from the sample. In the coffee analogy, the energy threshold is fixed-- it takes some amount of energy to get a molecule of water out of the liquid phase. The re-thermalization of the remaining liquid will bump a few of the leftover molecules up to sufficient energy to get them out, but as long as you're dealing with a finite sample, you'll eventually hit a point where there just aren't any molecules at that energy to remove, so the evaporative cooling process cuts off.
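
The difference shows up nicely if you extend that toy into a "forced evaporation" loop, ramping the threshold down between cuts. All the numbers here (the starting cut, the 20%-per-step ramp, the assumption of perfect rethermalization between cuts) are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
kT = 1.0
energies = rng.gamma(shape=1.5, scale=kT, size=N)

cut = 4.0 * kT
for step in range(25):
    energies = energies[energies < cut]   # remove the hot tail
    kT = energies.mean() / 1.5            # new temperature of the remainder
    # assume perfect rethermalization: redraw a thermal distribution at the new kT
    energies = rng.gamma(shape=1.5, scale=kT, size=energies.size)
    cut *= 0.8                            # ramp the "knife" down 20% per step

print(f"final kT: {kT:.4f} (started at 1.0)")
print(f"fraction of the sample left: {energies.size / N:.3f}")
```

Run it and the temperature drops by a couple of orders of magnitude at the cost of most of the sample-- which is pretty much the deal you make in a real BEC experiment. With a fixed cut, by contrast, the loop stalls after a few steps, just like the coffee.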

Which got me to idly wondering whether you could actually do this with a hot liquid. That is, would it be possible to substantially cool a container of hot liquid by evaporative cooling, by somehow lowering the energy threshold at which molecules can leave the system? For purposes of idle speculation, we'll say that "substantially cool" means "reduce to a temperature below room temperature, assuming excellent thermal isolation of the container from the environment, in a time less than would be required to reach that temperature by ordinary cooling processes."

Of course, there is a well-known way to do this kind of thing: you reduce the pressure. The boiling point of water depends on pressure, dropping to a bit above room temperature at a pressure of 20-30 torr, which is not a particularly impressive vacuum by experimental physics standards. So, in principle, it seems like you ought to be able to do it-- stick a beaker of hot water with a thermometer in it into a bell jar, and start pumping air out. If you could do it right, maybe you could get a substantial reduction in temperature at the cost of throwing a lot of your liquid away, just the way that AMO physicists do in making BEC.
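
You can sanity-check that figure with the standard Clausius-Clapeyron relation; the constants below are textbook values for water, and the arithmetic is mine:

```python
import math

L = 40.66e3              # latent heat of vaporization of water, J/mol (at 100 C)
R = 8.314                # gas constant, J/(mol K)
T0, p0 = 373.15, 760.0   # normal boiling point (K) and pressure (torr)

T = 298.15               # room temperature, ~25 C
p = p0 * math.exp(-(L / R) * (1.0 / T - 1.0 / T0))
print(f"estimated boiling pressure at {T - 273.15:.0f} C: {p:.0f} torr")
# prints ~28 torr -- consistent with the 20-30 torr figure above
# (the measured vapor pressure at 25 C is about 24 torr; the difference
# comes from treating the latent heat as constant between 25 and 100 C)
```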

And, of course, you can find videos demonstrating this sort of thing, such as this one, where Edwards Vacuum freezes a bunch of water by pumping on it:

This is the same basic physics, but cast in different language than we usually use when talking about evaporative cooling toward BEC, where the reduction is done much more slowly. Here it's cast as a change in the boiling and freezing points of water-- at low pressure, the beaker of water bubbles dramatically because the boiling point has dropped below the temperature of the sample, putting the whole thing above boiling at once.

The question I was wondering about is-- and I don't know if this is a sensible question-- whether you can lower the temperature of the liquid by evaporation without the dramatic boiling, in a manner analogous to evaporative cooling in ultra-cold atom experiments. That is, rather than lowering the boiling point for the entire sample quickly and making the bulk boil, lower it slowly, so that only the "hot" molecules escape while those with lower energy happily remain in the liquid phase. That might mean losing less of the liquid to boiling (or maybe more; I'm not sure how you would calculate that).

The physics of the phase change complicates this whole business, of course-- thermodynamics was not one of my better subjects-- and there's also a lot more than just evaporation going on. A real cup of cooling coffee will lose heat not just through evaporation, but also through direct conduction through the container and convection of the air directly above the liquid (this is, after all, why my insulated Starbucks cup works well to keep my tea hot). I don't know how big a factor the evaporation actually is relative to those. And, of course, when they boil/freeze water in vacuum, the container is in contact with a large vacuum apparatus at room temperature, which will tend to make it hard to change the temperature of the liquid.
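
One crude way to get at the "how big a factor" question is a simple energy budget: if evaporation alone had to do all the cooling, how much of the liquid would have to leave? The cup size and temperature drop below are made-up but plausible numbers; the water properties are standard:

```python
c_water = 4.18      # specific heat of water, J/(g K)
L_vap = 2260.0      # latent heat of vaporization near 100 C, J/g
mass = 300.0        # grams of coffee in a biggish cup (assumed)
dT = 60.0           # cooling from ~85 C to ~25 C (assumed)

heat_to_remove = mass * c_water * dT        # ~75 kJ
mass_evaporated = heat_to_remove / L_vap    # ~33 g
print(f"mass lost if evaporation did all the cooling: {mass_evaporated:.0f} g")
print(f"that's {100 * mass_evaporated / mass:.0f}% of the cup")
```

So evaporation is energetically capable of doing the whole job for the price of about a tenth of the cup; how much of the cooling it actually does in practice depends on the conduction and convection rates, which this doesn't address.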

This makes me want to go in the lab and start screwing around with vacuum pumps and thermocouple gauges, though. Happily, I let the Mechanical Engineering department borrow our big bell jar, so I can't easily add yet another project to the loooooong list of interesting physics things I don't have time to actually do.

(The "featured image" at the top is taken from this discussion at the Physics Stack Exchange, which takes on a vaguely related question in a fun way, including a simple experiment.)

I worked on a similar problem as part of my thesis. The turbulent boiling behavior actually happens where the liquid touches the container (the bubbles nucleate on some scratch or other imperfection). So, you could suspend the liquid in zero-g so there's no solid container, or on Earth you could cool the container while you're lowering the pressure such that the temperature of the container's inner surface stays below the boiling point.

“reduce to a temperature below room temperature, assuming excellent thermal isolation of the container from the environment, in a time less than would be required to reach that temperature by ordinary cooling processes.”

It seems to me that high vacuum (which rules out liquid water) is necessary to make that sort of evaporative cooling practical; otherwise the energy carried off by molecules leaving the surface won't get far before thermalization, and so the temperature won't drop below that of the surrounding gas.
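
To put a rough number on that thermalization argument (my own estimate, using the standard kinetic-theory mean-free-path formula and an assumed effective diameter for a water molecule):

```python
import math

k_B = 1.381e-23    # Boltzmann constant, J/K
T = 300.0          # K
d = 2.75e-10       # effective diameter of a water molecule, m (assumed)
torr = 133.3       # Pa per torr

for p_torr in (760.0, 20.0, 1e-3):
    p = p_torr * torr
    lam = k_B * T / (math.sqrt(2) * math.pi * d**2 * p)
    print(f"p = {p_torr:8.3f} torr -> mean free path ~ {lam:.2e} m")
# At 20 torr the mean free path is a few microns, so escaping molecules
# collide (and hand their energy back) almost immediately; you need much
# lower pressures before they leave the neighborhood for good.
```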

I definitely agree that at some point, you would need a decent vacuum to get substantial thermal isolation from the environment. It's not obvious to me how quickly that would come up-- it seems that it would depend on the rate at which heat flows in and out of the system via conduction through the containers and air.

Regarding the boiling, what I'm wondering is whether you can do substantial cooling without bringing the bulk of the liquid below the boiling point, in analogy with the evaporation done in BEC systems. The demo video, and most of the others like it, just crank the pressure down quickly, because they're going for a nifty visual, and the rapid boiling is a plus. But if you're cooling a rubidium cloud to the BEC transition, you don't do it by dropping the RF "knife" frequency immediately to a temperature below the average of the sample-- that just throws most of the atoms away, without doing efficient cooling. As I've always understood it, you keep the knife a bit above the actual temperature, so you throw out just the hot tail of the Maxwell-Boltzmann distribution, leaving the majority of the sample alone.

The analogue here would be to keep the pressure at a value that puts the boiling point (which I'm guessing serves as an approximate measure of the energy at which a molecule can be ejected from the liquid) just a bit above the actual sample temperature, so you pump away the "hot" fraction of the liquid, but never have a majority of it at a temperature above the boiling point. In which case, you shouldn't be nucleating bubbles and producing boiling.

I don't know exactly what the optimum evaporation point is, in terms of the average energy, though. I never really needed to go into that much detail with the evaporation process-- when I did the squeezed state stuff, we just optimized it on an empirical basis, and left it at that. There's probably some paper from the early 1990's that spells it all out, though. I should go look for that, in my copious free time.
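
In lieu of digging up that paper, here's a toy version of the trade-off: for an energy cut at eta times kT, what fraction of the sample do you lose per cut, and how much energy does each escaping particle carry off? Monte Carlo over a 3D thermal distribution, illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)
energies = rng.gamma(shape=1.5, scale=1.0, size=1_000_000)  # kT = 1

for eta in (1.5, 2.0, 3.0, 4.0, 5.0):
    tail = energies[energies > eta]            # the particles a cut at eta removes
    frac = tail.size / energies.size
    print(f"eta = {eta:.1f}: lose {100 * frac:5.2f}% per cut, "
          f"each escapee carrying ~{tail.mean() / 1.5:.2f}x the mean energy")
# Higher cuts remove fewer particles but each one is more "expensive" in
# energy; the optimum has to balance that against the rethermalization
# (collision) rate, which this toy calculation ignores.
```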

I don't have it at hand at the moment, but in an old physics book I read about a nice demonstration of how to freeze mercury with ether by sticking it in a hot oven.