On the Singularity

[Image: Eliezer Yudkowsky]


Can't listen to it, but: Rapture for nerds.

By John Emerson (not verified) on 07 Jun 2008

Yudkowsky seems pretty reasonable with his points about sloppy futurism, cognitive biases, and probability distributions. Here's a quote from his blog:

I think Roger Schank gives the game away when he says:

"When reporters interviewed me in the 70's and 80's about the possibilities for Artificial Intelligence I would always say that we would have machines that are as smart as we are within my lifetime. It seemed a safe answer since no one could ever tell me I was wrong."

There is careful futurism, where you try to consider all the biases you know, and separate your analysis into logical parts, and put confidence intervals around things, and use wider confidence intervals where you have less constraining knowledge, and all that other stuff rationalists do. Then there is sloppy futurism, where you just make something up that sounds neat. This sounds like sloppy futurism to me.

Seems like he's on the right path to avoid some of the errors of a previous generation of futurists.

I'm very pessimistic on this point. We already invest a lot of resources in almost totally dumb machines and get little from them (example: we use, feed, and repair the car, but all we get is greater distances to travel, traffic jams, and ecological problems, not more leisure time, maybe even less). The trend so far has been to build a technological society in which everything orbits around machines; now we even grow human food to make biofuels... And eventually a human-like or superhuman intelligent machine will be built. I have no idea what kind of "emotions" and "desires" it will have, but it will certainly have its own interests, and those won't be alien to its possibly cold, logical mind. Humans will then be obsolete.

Consolation prize: the new machines will be somewhat human-like. But that's all.

I really don't see how such a development could help humans at all. And I don't understand the geek slang (the very term "Singularity" applied outside astrophysics is totally new to me) that seems to claim that such a development will be beneficial. It will not be, certainly not for humans. The machines will have their own interests and will manage to impose them. We will never be able to monitor and control them without help from machines. It's the perfect way of kicking one's own butt.

A brief, almost illusory, and certainly out-of-control Anthropocene followed by a Mechanocene, whose parameters I can't really imagine. If that's the best we can do, then maybe we deserve to end in that really stupid way: destroyed by our own creations.

Has it ever occurred to people that when Jesus spoke of 'it' coming in the night, taking this person but leaving that person behind, he was speaking of death? Thus people have been 'raptured' all along, and people will be 'raptured' in due time.

One more thing: when did the Bible become without flaw or error?

The transhumanists I've run into have tended to be overenthusiastic about the singularity; they seem to think it is inevitable, and the only debiasing they did was to clear away bias against the singularity, or against technology. Some were global-warming deniers; others were global-warming fatalists; most seemed to deny the possibility or desirability of government action, from a libertarianesque perspective; some were enthusiasts for space colonization, on the grounds that there was no hope for Earth; most seemed to be intense tech buffs and believers in technical fixes for all problems.

I'm sure that there are others who don't have these traits.

By John Emerson (not verified) on 08 Jun 2008

You're on TV, on the internet, man... Look, I know you're a fucking nerd, but Jesus, some fashion, please. That green shirt looks like it cost under 5 dollars. Eliezer, do you know how many little Chinese fingers were nipped so you could dress like a schmuck on television?