Now that’s an attention-getter!

It comes from Ted Chiang’s Big Idea post on John Scalzi’s blog Whatever. It’s a promotional piece for Chiang’s latest book, The Lifecycle of Software Objects, which is about artificial intelligence.

For those of you who haven’t heard of him, Chiang is one of the real breakout science fiction writers of the last two decades or so; his stories have consistently won both awards and the highest praise from reviewers and critics. This is his longest work to date. (His first collection is Stories of Your Life and Others, which contains many of his most famous stories.)

A couple of choice quotes from the Big Idea:

It’s been over a decade since we built a computer that could defeat the best human chess players, yet we’re nowhere near building a robot that can walk into your kitchen and cook you some scrambled eggs. It turns out that, unlike chess, navigating the real world is not a problem that can be solved by simply using faster processors and more memory. There’s more and more evidence that if we want an AI to have common sense, it will have to develop it in the same ways that children do: by imitating others, by trying different things and seeing what works, and most of all by accruing experience. This means that creating a useful AI won’t just be a matter of programming, although some amazing advances in software will definitely be required; it will also involve many years of training. And the more useful you want it to be, the longer the training will take.

*snip*

And that’s what I was really interested in writing about: the kind of emotional relationship that might develop between humans and AIs. I don’t mean the affection that people feel for their iPhones or their scrupulously maintained classic cars, because those machines have no desires of their own. It’s only when the other party in the relationship has independent desires that you can really gauge how deep a relationship is. Some pet owners ignore their pets whenever they become inconvenient; some parents do as little for their children as they can get away with; some lovers break up with each other the first time they have a big argument. In all of those cases, the people are unwilling to put effort into the relationship. Having a real relationship, whether with a pet or a child or a lover, requires that you be willing to balance someone else’s wants and needs with your own.

I really need to get myself a copy of that book!

(And yes, you’ll have to head over to Scalzi’s blog to see the context of the title quote!)

Comments

  1. #1 John
    August 10, 2010

    “Suppose you had a digital simulation of Paris Hilton’s brain…”

    A simulation of a simulation?

  2. #2 Aaron
    August 10, 2010

    You mean “2”?

  3. #3 blf
    August 10, 2010

    Suppose you had a digital simulation of Paris Hilton’s brain…

    /dev/null

  4. #4 Jack
    August 11, 2010

    Very interesting, but I’m just wondering what the point is of providing a few years of training to a robot only to teach it to make scrambled eggs… If it’s not profitable, it’s not going to be developed. And it’s not profitable, as it’s cheaper to pay humans from poor countries to do the work. We are ‘homo economicus’.

  5. #5 Vex
    August 15, 2010

    And if an AI takes years to train, a good way to get a human to invest that kind of time is to create an emotional bond between the two.

    Umm, Ted?

    There’s this thing called “money”, you may have heard of it before…

