SL4 meets "Pinky and the Brain"

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jul 16 2002 - 04:10:47 MDT


James Higgins wrote:
> Mike & Donna Deering wrote:
> > a human. Or for that matter the status of an AI programmer. How do we
> > know that Eliezer isn't trying to take over the world for his own
> > purposes?
>
> Um, what else do you think Eliezer IS trying to do if not take over the
> world for his own purposes? That may not be the way he states it but
> that is obviously his goal. His previously stated goal is to initiate
> the Sysop which would, by proxy, take over the world. He states that
> this is to help all people on the Earth, but it is still him taking over
> the world for his own purposes.

James, this is pure slander. I've stated before and will state again
that the Sysop Scenario was one bright stupid idea that I sincerely wish
I'd never come up with, since it seems to take over people's minds
faster than a transhuman AI with a VT100 terminal. The Sysop Scenario
is the Singularitarian equivalent of the Sixties Syndrome or the
Jetsons: futurism couched in human terms, easy to imagine and
therefore probably flat wrong.

What does matter, and does not change regardless of whether seed AI
results in a Sysop Scenario or in the first very powerful transhuman
helping the world make a smooth transition to a civilization of
equally powerful transhumans, is whether the mind you build has *your
own* goal system, or whether the result is the same mind that would be
built by *any* AI project that shares the moral belief that the
outcome of the Singularity shouldn't be made to depend on who
implements it. This
holds true whether it works because of objective morality, or because of
a "subjective" yet clearly widely shared belief that certain kinds of
personal morality should *not* be transferred into the first transhuman
AI. I clearly spend a great deal of time worrying about this. Ben, on
the other hand, has stated in clear mathematical terms his intention to
optimize the world according to his own personal goal system, yet you
don't seem to worry about this at all. I'm not trying to attack Ben,
just pointing out that your priorities are insane. Apparently you don't
listen to what either Ben or I say.

Oh, well. I've been genuinely, seriously accused of trying to take over
the world. There probably aren't many people in the world who can put
that on their CV.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

