Singularity Summit

ScienceBlogs' Razib is back from the Singularity Summit, with summaries of a good portion of the proceedings and some interesting links. Relatively recently I've written about why the Singularity very well may not happen (at least not the wilder version) - roughly, the growth curve of technology may be logistic rather than exponential thanks to the limiting effect of physical laws.
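To see why the distinction matters, here is a minimal numerical sketch (the parameters are made up for illustration, not fitted to any real technology data): a logistic curve is nearly indistinguishable from an exponential while it is far below its ceiling, which is exactly why early data alone can't settle the question.

```python
import math

# A logistic curve f(t) = K / (1 + e^{-r(t - t0)}) looks exponential while
# f(t) << K, then flattens as it approaches the ceiling K. The constants
# here are illustrative only.
K, r, t0 = 1_000_000.0, 0.1, 100.0

def logistic(t):
    return K / (1 + math.exp(-r * (t - t0)))

def exponential(t):
    # Exponential with the same starting value and early growth rate.
    return logistic(0) * math.exp(r * t)

# Early on the two curves agree closely...
early_ratio = logistic(20) / exponential(20)
# ...but far out the logistic saturates while the exponential explodes.
late_ratio = logistic(200) / exponential(200)

print(f"ratio at t=20:  {early_ratio:.4f}")  # close to 1
print(f"ratio at t=200: {late_ratio:.2e}")   # tiny: the logistic has flattened
```

Early on the two curves track each other to within a fraction of a percent; by the time the logistic has saturated, the exponential extrapolation is off by orders of magnitude.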

Probably the most passionate proponent of the Singularity is Ray Kurzweil, and according to a piece Razib links, he disagrees. (The piece itself is somewhat old, but apparently he presented an updated version at the conference.) I find his disagreement profoundly unconvincing, but interesting enough to comment upon. Let me quote the first few sentences before getting to the meat, much later in the essay:

An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense "intuitive linear" view. So we won't experience 100 years of progress in the 21st century -- it will be more like 20,000 years of progress (at today's rate). The "returns," such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth.

Yeesh. That surely made the mathematically inclined readers wince. Of course there's exponential growth in the rate of exponential growth. That's the definition of exponential. The growth of the growth of the growth of the growth [repeat as desired] will be exponential too, for as far as you want to carry it. That's just the way any exponential function works. Moving on, and skipping way down to the relevant bit:

Why Intelligence is More Powerful than Physics
As intelligence saturates the matter and energy available to it, it turns dumb matter into smart matter. Although smart matter still nominally follows the laws of physics, it is so exquisitely intelligent that it can harness the most subtle aspects of the laws to manipulate matter and energy to its will. So it would at least appear that intelligence is more powerful than physics.

Counterexample: the most powerful intelligence in the world, human or computer or (for that matter) God himself, cannot beat me in Tic-Tac-Toe without cheating. The rules simply don't allow any room for increased intelligence to result in increased performance. Perfect play requires only childlike intelligence plus a few games' worth of experience. The laws of the universe are not fully known and in any case are much more complicated. But we have no reason to believe they can be bent at will by sufficient smarts. If our current knowledge is correct, plenty of things will remain impossible - breaking lightspeed, breaking the Heisenberg uncertainty principle, violating the conservation laws, and so forth. It might not be correct, but it seems a stretch to assume that the rules cease to apply when a genius examines them.
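The Tic-Tac-Toe claim is easy to check by brute force. Here is a minimal minimax solver (a sketch; any exhaustive game-tree search would do) confirming that perfect play by both sides always ends in a draw:

```python
from functools import lru_cache

# Exhaustive minimax over Tic-Tac-Toe. Boards are 9-character strings,
# "." marking an empty cell; X moves first.
WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in WINS:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Game value under perfect play: +1 X wins, -1 O wins, 0 draw."""
    w = winner(board)
    if w == "X":
        return 1
    if w == "O":
        return -1
    if "." not in board:
        return 0
    nxt = "O" if player == "X" else "X"
    vals = [value(board[:i] + player + board[i+1:], nxt)
            for i, cell in enumerate(board) if cell == "."]
    return max(vals) if player == "X" else min(vals)

result = value("." * 9, "X")
print("perfect-play result:", result)  # 0: a guaranteed draw
```

The search visits every reachable position, so the value 0 is not an estimate: no strategy, however clever, does better than a draw against a perfect opponent.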

Perhaps what I should say is that intelligence is more powerful than cosmology. That is, once matter evolves into smart matter (matter fully saturated with intelligence), it can manipulate matter and energy to do whatever it wants. This perspective has not been considered in discussions of future cosmology. It is assumed that intelligence is irrelevant to events and processes on a cosmological scale. Stars are born and die; galaxies go through their cycles of creation and destruction. The Universe itself was born in a big bang and will end with a crunch or a whimper, we're not yet sure which. But intelligence has little to do with it. Intelligence is just a bit of froth, an ebullition of little creatures darting in and out of inexorable universal forces. The mindless mechanism of the Universe is winding up or down to a distant future, and there's nothing intelligence can do about it.

That's the common wisdom, but I don't agree with it. Intelligence will be more powerful than these impersonal forces. Once a planet yields a technology creating species and that species creates computation (as has happened here on Earth), it is only a matter of a few centuries before its intelligence saturates the matter and energy in its vicinity, and it begins to expand outward at the speed of light or greater. It will then overcome gravity (through exquisite and vast technology) and other cosmological forces (or, to be fully accurate, will maneuver and control these forces) and create the Universe it wants. This is the goal of the Singularity.

At this point one must genuflect in the direction of the greatest science fiction short story ever written, but I'm pretty sure it will remain fiction. Why in the world should gravity be any more possible to overcome than the rules of Tic-Tac-Toe? There's much we don't understand about gravity on the cosmic and microscopic scales, but there's a whole bunch we do understand, and it points toward the mathematical impossibility of simply overcoming gravity at will.

I am optimistic about the future of technology, and I'm sure it will change the world in many profound and interesting ways. Science and technology have a lot of room left to grow. But I'm pretty sure it's not infinite. If the effort to rule the laws of the entire cosmos were traded on Wall Street, I'm pretty sure I wouldn't be investing.


We have copious examples of singularity events in history. Jared Diamond covers several of them in Collapse. If you're hoping for a singularity, just wait, you're likely to get one, good and hard.

By Nathan Myers (not verified) on 08 Oct 2009 #permalink

There's even exponential growth in the rate of exponential growth.

He's probably thinking that e to the e'th power is, like, exponentially exponential, man. *puff*

Counterexample: the most powerful intelligence in the world, human or computer or (for that matter) God himself, cannot beat me in Tic-Tac-Toe without cheating.

But we "cheat" all the time. You can't send data faster than the speed of light, but you can compress the data. You can't eliminate gravity, but you can use buoyancy and other balancing forces and even the curvature of the earth to fly. You can't create or destroy matter or energy, but you can conserve it, you can hide it, you can convert it from one form to another... all sorts of tricks to "cheat" how physics is supposed to work.

I mean, the idea that scientific advancement will advance at an exponential rate until we all become eternal omnipotent gods is pure wishful thinking.

But there's so much shit we still don't know. Maybe we won't be star-hopping to the edge of the universe, but if we're terraforming Mars in 2109, I'll consider that a respectable leap. Superluminal travel might be impossible, or it might just be waaaaaaaaaaaaaaaaaaay harder than we originally thought.

I'll start worrying about the end of science when I get my flying car.

Well, like I said before, Kurzweil is kind of a crank. You should try taking on Eliezer Yudkowsky (http://yudkowsky.net).

The intelligence explosion idea is much more powerful and compelling than the exponential growth advocated by Kurzweil.

As I understand Kurzweil, what he means is that the growth rate can be approximated well as a close-to-exponential function with the base growing slowly. That is, log f is slightly superlinear. Or to put it another way, f'/f -> oo as t -> oo, but does so at a slow rate. He hasn't done a good job of stating this precisely, but judging from the graphs he has drawn and his various examples, that's what he seems to mean.

At this point one must genuflect in the direction of the greatest science fiction short story ever written, but I'm pretty sure it will remain fiction.

Asimov mapped out one path to the singularity, but I think Fredric Brown's vision was closer to what most singularity buffs have in mind.

Progress arises from wealth - surpluses redirected to fuel apparently unproductive intellectual endeavors. There is $100 trillion of US Federal debt. All "surpluses" are directed toward gorging genetic, developmental, and behavioral trash; reproductive warriors, religious hind gut fermenters, drug addicts, Enviro-whiner Luddites; the stupid, the pathetic, and the Officially Sad.

The only singularity to come will be serfs pounding reinforced concrete with rocks to retrieve rebar. God's dominion of poverty, hunger, disease, filth, death, and silk-clad priests with whips will be spiced with tales of supermarkets in heaven.

October 2009
http://www.cnn.com/2009/WORLD/asiapcf/10/06/afghanistan.us.deadly.fight…
"The base, in an eastern Afghanistan valley, was surrounded by ridge lines where the insurgents were able to fire down at U.S. and Afghan troops."

April 1954
http://en.wikipedia.org/wiki/Battle_of_Dien_Bien_Phu
"Viet Minh's possession of heavy artillery (including anti-aircraft guns) and their ability to move such weapons to the mountain crests overlooking the French encampment."

Is there anybody in the Pentagon who knows anything about field operations from a winner's point of view? First derivatives sell books, second derivatives burn cities. Know whether extrema are sulci or gyri.

That's just the way any exponential function works.

No. Kurzweil is stating that in e^{kt}, k is itself a slowly increasing function of t.

Kurzweil may be many things, but he is definitely not a nontechnical guy and certainly more mathematically competent than the vast majority of people.
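A quick numerical sketch of the distinction (the particular k(t) below is made up purely for illustration): for a plain exponential e^{kt}, the relative growth rate f'/f is the constant k, while for exp(k(t)*t) with k slowly increasing, f'/f creeps upward over time.

```python
import math

# Estimate the relative growth rate f'/f by central difference.
def rel_rate(f, t, h=1e-5):
    return (f(t + h) - f(t - h)) / (2 * h * f(t))

plain = lambda t: math.exp(0.3 * t)                  # constant base
kurz  = lambda t: math.exp((0.05 + 0.001 * t) * t)   # slowly growing base

for t in [10.0, 50.0, 100.0]:
    print(f"t={t:5}: plain f'/f = {rel_rate(plain, t):.4f}, "
          f"growing-base f'/f = {rel_rate(kurz, t):.4f}")
```

For the plain exponential the printed rate stays at 0.3 everywhere; for the growing-base version it rises steadily, which is one way to read "exponential growth in the rate of exponential growth" as a nontrivial claim.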

This is just a trivially picky mathematical point but it seems to me that the way to translate the idea of "exponential growth in the rate of exponential growth" is as the math expression exp(exp(rt)) or e^e^rt, not as suggested above as exp(k(t)) or e^k(t) with k(t) as some, possibly gently, increasing function of t.

The growth behavior of this exponential of an exponential, for a plausible value for r of about 0.05, becomes so ridiculous in a couple of human generations that the point about hitting some type of barrier and conversion to a logistic-type curve seems inescapable.
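To put numbers on that blow-up (the units are arbitrary; only the shape matters):

```python
import math

# A doubly exponential exp(exp(r*t)) with r = 0.05 per year is tame at
# first, then absurd within roughly two human generations.
r = 0.05
for t in [0, 30, 60, 90, 120]:
    print(f"t={t:3} years: exp(exp(r*t)) = {math.exp(math.exp(r * t)):.3e}")
```

At 60 years the value is already in the hundreds of millions; by 120 years it is around 10^175, far beyond any physically meaningful quantity, which is the sense in which a barrier and a logistic turnover seem inescapable.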

You don't have to subscribe to some goofy Jay Forrester/Club-of-Rome limits-to-growth theology to reach this conclusion -- you soon get to increases like going from the LHC at CERN to colliders bigger than the diameter of the Solar System, and getting the LHC funded was a struggle -- go visit Waxahachie, TX and the cancelled SSC.

This singularity stuff is singularly naive: you get to nonsense like Frank Tipler's cosmology well before the Omega point.

By brian smith (not verified) on 10 Oct 2009 #permalink

This singularity stuff is singularly naive: you get to nonsense like Frank Tipler's cosmology well before the Omega point.

The "I'm paranoid about dying" geek crowd. I wonder if Kurzweil has ever actually gotten laid. Nice dream we become the end user and the manufacturer at the same time. I wonder if there has ever been a single movie or story ever told that thought this was a good Idea. NO, because it's not efficient, it's as simple as that. Self-looped logic is all this is.

By david thurman (not verified) on 10 Oct 2009 #permalink

Nice dream we become the end user and the manufacturer at the same time. I wonder if there has ever been a single movie or story ever told that thought this was a good Idea. NO, because it's not efficient, it's as simple as that.

Plants basically function this way, and they've been getting along rather well for the last billion years or so.

here is my singularity scheme for getting rich in 30 days:

each day you ask some people for a single penny. on the first day you only ask for one penny. however, each successive day, ask twice as many people as the day before. it's that easy!!!!! by the end of the month you will be rich rich rich-and all you do is ask people for pennies!!!!

each day you ask some people for a single penny. on the first day you only ask for one penny. however, each successive day, ask twice as many people as the day before. it's that easy!!!!! by the end of the month you will be rich rich rich-and all you do is ask people for pennies!!!!

~~Plants~~ Religions basically function this way, and they've been getting along rather well for the last ~~billion~~ 6000 years or so.
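For anyone checking the arithmetic of the penny scheme, a quick sketch:

```python
# The penny scheme, worked out: on day k you ask 2**(k-1) people for a
# penny each. The take is a geometric series - but so is the number of
# marks you need to find.
days = 30
total_pennies = sum(2 ** (k - 1) for k in range(1, days + 1))  # = 2**30 - 1

print(f"total after {days} days: ${total_pennies / 100:,.2f}")
print(f"people needed on day {days} alone: {2 ** (days - 1):,}")
```

Which is, of course, the joke: the payoff is real (over ten million dollars), but so is the requirement of finding more than half a billion fresh donors on day 30 alone. Exponentials run into resource limits.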

An analysis of the history of technology shows that technological change is exponential

Oh yeah? What units are you measuring "technological change" in, pray tell? How many "Kurzweils" are there between a palaeolithic hand axe and a neolithic discoidal knife, for example? Or to make it slightly more challenging, how many "Kurzweils" between that same palaeolithic hand axe and a completely novel technology - say, birch tar resin? (Birch tar was the epoxy of the neolithic.) What does "technological change is exponential" actually mean?

Other questions for Ray: What the hell happened between the Antikythera mechanism and the Renaissance? Have you ever heard of "the fall of Rome"? What about "the Dark Ages"? (Not that the Dark Ages were actually as "backward" as they are usually portrayed, but nevertheless...)

Monotonic progress? My ass...

While he doesn't go into this, you probably could come up with some kind of metric, or collection of metrics.

For instance, you could measure the maximum force capable of being exerted by a single average individual using peak technology. So, for instance, a 200 lb adult male might exert 100 lbs of force with a hammer. With the invention of the fulcrum, he could exert 250 lbs. With the domestication of livestock combined with, say, the wheel, he could exert 500 lbs. Then someone creates the catapult and can exert 1000 lbs. The steam engine maybe gives 10,000 lbs. The gas engine plus some engineering advances gives 100,000 lbs. And it all culminates in the NASA space program, where you've got people reaching escape velocity - or maybe the air force program that gave us supersonic jets.

You could put similar scalars on "how much we can see" and scale up all sorts of visual technologies from telescopes to litmus paper. Or you could look at the complexity of our chemistry - from the invention of fire, to alloys, to synthetics, and so on.

Give all these things weights and you could create some kind of general "technology" metric.

If it took the Romans 1000 years to go from 400 lbs of force to 800 lbs of force, but only 50 years for the Europeans to go from 1000 to 2000 lbs of force, and then it takes an American engineer 2 years in a lab to get us rocket technology, then we're talking exponential - maybe even better - degrees of development.
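One way to sanity-check that comparison (using the comment's own hypothetical figures, not historical data) is to convert each doubling time into an implied continuous growth rate:

```python
import math

# (era, years per doubling of the force metric) - the numbers are the
# comment's hypotheticals, not real measurements.
eras = [("Romans", 1000), ("Early-modern Europe", 50), ("20th century", 2)]

for name, years in eras:
    rate = math.log(2) / years  # continuous rate implied by one doubling
    print(f"{name:20s}: doubling every {years:4d} yr -> {100 * rate:.3f}%/yr")
```

A constant rate across eras would mean plain exponential growth; a rising rate like this one is what "exponential growth in the rate of growth" would actually look like in such a metric.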

For instance, you could measure the maximum force capable of being exerted by a single average individual using peak technology.

"Give me a lever, a fulcrum, and a place to stand, and I can move the world." - Archimedes (c. 287 BC - c. 212 BC)

It sounds to me like you're trying to construct a metric that gives you the answer you want to hear.

The big question people overlook is: what does "technological progress" actually mean? Does the refinement of existing technology (say, going from a hand axe to a discoidal knife, or a PDP-11 to Deep Blue) really count for as much as, if not more than, the invention of completely new technology? Personally, I tend to think that going from not being able to make fire at all to being able to make it at will was a bigger technological leap than going from Hero's Engine to the Saturn V rocket - but I can't see any way of proving it. It all depends on what you mean by "progress".

Physicists, mathematicians and computer scientists need to sit back and study some actual living organisms. I suggest frogs and cats.

Artificial intelligence and automatic image analysis (the two are very closely related) have been ongoing science (fair?) projects for around 40 years, and NO progress has been made, various lying liars to the contrary notwithstanding. [The record of the tokamak clowns is equally bad and deceptive, and they have gotten much more money.]

First, modern image analysis programs do not even operate at the level of frogs, who detect only motion, yet do so accurately enough to catch flying insects. No artificial intelligence program operates anywhere near the level of a cat. If, then, else ... ad nauseam.

Cats and frogs not only catch and eat prey, they find their way around the world and automatically reproduce themselves. No computer comes anywhere near any of this behavior. And the programmers have no clue as to how to emulate it. We're being overrun by cats and Burmese pythons, not robots.

Computer nerds are very impressed by flops (pun intended). There now exist petaflop (pun intended) machines. I have seen graphs plotting flop evolution vs time, with some points labelled as mice, monkeys, etc. This is nonsense. No computer operates anywhere near the computation level of ANY animal.

I do not believe digital computers will ever become intelligent. Hubert Dreyfus (What Computers Still Can't Do) might agree, although the absolutism of my statement might be too much for him. The Singularity won't ever happen either. Kurzweil and his ilk are deluded.

By Bob Sykes (not verified) on 15 Oct 2009 #permalink