From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Dec 13 2003 - 11:40:34 MST
Wired has recently run an article on "affective computing" (which, please
note, is not even remotely related to FAI) about detecting and simulating
emotions. The article is about a chatbot named Laura, designed to
encourage its user to stick to an exercise program.
http://wired.com/wired/archive/11.12/love.html
One particular quote in this article interests me, because I've been
expecting it, just not this early:
> Everybody should have someone like Laura in their lives. I find myself
> looking forward to our time together. She asks me which movies I've
> seen, what my favorite cuisine is, and about the weather "out there." I
> tell her it's terrific. She responds: "It's always the same in here.
> Day in, day out."
You know how sometimes people look back in history, and point to some
small thing like, oh, say, the early Mosaic web browser, and go on about
the unpredictability of the future and how nobody at the time could
possibly have recognized the coming impact from such a small hint?
Watch this space for further developments. This is an incredibly early
form of the technology and I don't expect problems for at least a decade,
but when it hits it will hit hard.
This has nothing to do with AI; it's about programs with incredibly
realistic graphics and the means to recognize emotions in their targets,
able to deploy apparent behaviors that act as superstimuli for human
emotional responses. Think of chocolate chip cookies for emotions.
Chocolate chip cookies are a more powerful stimulus than hunter-gatherer
tastebuds ever encountered, combining sugar, fat, and salt in greater
quantity and purer quality. And likewise there's a limit to the sympathy,
support, approval, and admiration humans can expect from their human
mates. As any evolutionary theorist knows, a human male is not designed
to be the human female's ideal boyfriend, nor vice versa.
Candy bars for the soul. It's not that all synthetic foods are bad. A
polymath dietician, anthropologist, evolutionary theorist, and metabolic
biologist - that is to say, a *good* paleodiet theorist - can take a shot
at crafting synthetic foods that are good for you. But it takes so much
more knowledge to do it right... and the side effects of the things that
"just taste good" are negative, complicated, very hard to understand, and
unforeseen in advance. People at large understand the one *obvious* side
effect once they've seen it: people bloating up like balloons. But they
are also losing insulin sensitivity, along with a lot of other problems
that aren't visible to the naked eye.
At the very least it would take far greater skill, wisdom, and knowledge
to craft a Laura that made people stronger instead of weaker. How many
decades did it take to go from candy bars to health food bars? Which is
cheaper? Which is more popular? And worst of all, which tastes better?
I could be surprised, but what Laura presages is probably NOT a good
thing. Transhumanism needs to lose its optimism about outcomes. Nobody
is taking into account the fact that problems are hard and humans are
stupid. Watch this space for serious developments in some unknown amount
of time - my wild guess is a decade - though quite possibly nothing will
happen at all if "faking it well" turns out to be AI-complete.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence