Radiative equilibrium is one of the foundation stones of radiative forcing theory. But it is not a law of physics, only an archaic and untested supposition found in climatology textbooks.
“For the Earth to neither warm or cool, the incoming radiation must balance the outgoing.”
It’s best to regard radiant energy simply as a finite power flow; indeed, that power is expressed in watts per square meter. An object is said to “cool” by radiating, yet this would seem to imply that restricting its radiation will make it hotter and hotter. That is the very premise of greenhouse theory, of course: that by disturbing outgoing radiance, any magnitude of temperature gain is possible. But this is easy to test.
Confine a lightbulb inside an infrared barrier (like a spherical mirror) and electrically feed one watt to it. After a while, will it be generating the heat of a thousand-watt bulb? No.
When its temperature is consistent with the input, further heating stops.
It’s like water seeking its own level. Lacking any means to radiate to its surroundings, the lightbulb merely gets as hot as a watt of power can make it, which is not much hotter than it would be in the open. If that weren’t so, we’d be able to generate incredible temperatures very cheaply. Just confine, wait, and release.
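The steady-state idea in the thought experiment above can be sketched numerically with a simple Stefan-Boltzmann energy balance: temperature rises only until outgoing radiation matches the one watt coming in, then further heating stops. The bulb’s surface area, emissivity, and ambient temperature below are illustrative assumptions, not measured values.

```python
# A minimal energy-balance sketch: find the temperature at which a small
# object radiating to its surroundings emits exactly as much power as it
# receives. Assumed numbers: a ~3 cm radius bulb, emissivity 0.9, 293 K room.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrium_temp(power_in, area=0.0113, emissivity=0.9, t_env=293.0):
    """Solve power_in = emissivity * SIGMA * area * (T^4 - t_env^4) for T (kelvin)."""
    return (power_in / (emissivity * SIGMA * area) + t_env**4) ** 0.25

# One watt in: the bulb settles a little above room temperature and stays there.
t_one_watt = equilibrium_temp(1.0)
```

With these assumed numbers, one watt yields an equilibrium around 309 K, only a few degrees above ambient: a finite steady state set by the input power, not a runaway.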
Conservation of energy: it’s not just a phrase. The theory of radiative equilibrium arose early in the 19th century, before the laws of thermodynamics were understood.
Update: Jennifer Ouellette of Cocktail Party Physics suggests LOLdenialists: