In the Stealth in Space post earlier this week, we discussed the problem of detecting the thermal emission from a spacecraft. If the interior isn’t generating a lot of power, there’s not much thermal radiation being emitted, making it a tough job to detect.

But it was pointed out in the comments that the heat from the sun would itself warm the spacecraft exterior, increasing the thermal signature by potentially a large amount. Let’s verify this. If you take a perfectly absorbing sphere and set it in orbit around the sun (say, at the distance of the Earth), it’ll absorb all the light from the sun that intersects its cross-sectional area. But assuming it conducts heat well, it’ll be radiating equally in all directions – the area of emission will be the whole surface of the sphere. Thus at equilibrium, the power in from the sun equals the power out via radiant heat.

The power in is the power-per-area from the sun (about 1300 W/m^2 at the distance of the Earth) times the cross-sectional area, and the power out will be equal to the thermal-power-per-area given by the Stefan-Boltzmann law times the whole surface area:
$$\Phi A_c = \sigma T^4 A_t$$
Where Ac is the area of cross section, At is the total surface area, capital phi is the flux from the sun, and sigma is the Stefan-Boltzmann constant. For a uniform sphere, we can plug in the areas:
$$\Phi \, \pi r^2 = \sigma T^4 \cdot 4 \pi r^2$$
Solve for T:
$$T = \left(\frac{\Phi}{4\sigma}\right)^{1/4}$$
Notice the r’s have canceled; the final temperature is independent of the size of the sphere. With our numbers, the temperature is a toasty 275 K. Well, toasty with respect to space. You’d still want to wear a jacket.
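As a quick sanity check, the equilibrium temperature takes only a few lines of Python (using the post's round figure of 1300 W/m^2 for the solar flux at Earth's distance):

```python
# Equilibrium temperature of a perfectly absorbing, well-conducting sphere.
# Power absorbed over the cross section equals power radiated over the surface:
#   phi * pi * r**2 = sigma * T**4 * 4 * pi * r**2,  so  T = (phi / (4*sigma))**0.25

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)
PHI = 1300.0       # solar flux at 1 AU, W/m^2 (round figure used above)

T = (PHI / (4 * SIGMA)) ** 0.25
print(f"{T:.0f} K")  # about 275 K
```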

But that’s a pretty hefty temperature if you want to stay stealthy. In the comments I suggested making the spacecraft out of reflective or transparent materials, both of which have potentially serious problems. A better bet would probably be to tweak our formula above by giving our spacecraft a convenient shape. If, for instance, we shaped our spaceship like a pencil and kept either the point or the eraser aimed at the sun, only a small fraction of our total radiating area would be available to absorb sunlight. I’ll leave the equation-modification aside and quote some results: if there’s 100 times as much radiating area as there is absorbing area, the temperature is down to 123 K. If there’s 1000 times, it’s down to 69 K. Returns are diminishing (the temperature scales as the inverse fourth root of the ratio), but still significant. That combined with some judicious reflectors and I think you can get the hull temperature pretty low.
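The area-ratio trick amounts to replacing the sphere's factor of 4 with a general ratio At/Ac. A small sketch (the function name `hull_temp` is mine, not from the post):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)
PHI = 1300.0       # solar flux at 1 AU, W/m^2

def hull_temp(area_ratio):
    """Equilibrium temperature when the radiating area is
    area_ratio times the sun-facing cross-sectional area."""
    return (PHI / (area_ratio * SIGMA)) ** 0.25

# The sphere is the special case area_ratio = 4.
for ratio in (4, 100, 1000):
    print(f"{ratio:>4}x -> {hull_temp(ratio):.0f} K")
```

This reproduces the figures quoted above: 275 K for the sphere, 123 K at 100x, 69 K at 1000x.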

*Fig 1: Pencil-shaped spacecraft design, from 2001*

The problems the sun’s heat poses are also strongly influenced by where you’re trying to be stealthy. Around Mercury there’s a lot more solar light to deal with. Around Saturn, much less. All other things being equal, your hull temperature scales with the inverse square root of your distance from the sun. At Saturn, for instance, our 100x area blackbody spacecraft would have a hull temp of a frigid 39 K.
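The distance scaling falls straight out of the same formula: flux drops as 1/d^2, so temperature drops as 1/sqrt(d). A sketch, taking Saturn at roughly 9.5 AU:

```python
def hull_temp_at(distance_au, area_ratio=100):
    """Hull temperature for a blackbody spacecraft at a given distance
    from the sun (in AU). Flux falls as 1/d^2, so T falls as 1/sqrt(d)."""
    SIGMA = 5.670e-8              # Stefan-Boltzmann constant, W / (m^2 K^4)
    PHI_1AU = 1300.0              # solar flux at 1 AU, W/m^2 (round figure)
    phi = PHI_1AU / distance_au ** 2
    return (phi / (area_ratio * SIGMA)) ** 0.25

print(f"{hull_temp_at(9.5):.1f} K")  # close to the ~39 K quoted above
```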

So it’s something to think about if people ever come to blows over mining Ceres or something.

UPDATE: In the previous post, commenter Anthony brings up something well worth discussing here:

…if a ship is being hit by X watts of sunlight, it’s pretty much going to emit X watts worth of photons (unless it has somewhere to store the heat, and it usually only takes hours to days to overwhelm any practical heat sink), and tweaks to albedo and heat distribution across the surface just modify the spectrum and direction of the emissions.

Which is pretty important, lest we fall into the same trap I criticized in my previous post. In the end, total incoming = total outgoing. Lowering the temperature is good for reducing that part of the total emissions which are promiscuously broadcast in all directions. Hopefully you can catch most of the rest with a mirror and reflect it in a narrow, specific direction away from enemy sensors.