Re: SI definition of Friendliness

From: Eliezer S. Yudkowsky
Date: Fri Apr 06 2001 - 14:42:58 MDT

Chris Cooper wrote:
> Arona Ndiaye wrote:
> >
> >A 'Singularity for Dummies' sounds like a joke to me. Do not get me wrong,
> >but with all due respect: why should dummies need
> >to understand the Singularity ?
> This attitude is EXACTLY why it is important to explain these ideas to
> laypeople as we approach the Singularity. The kind of arrogant attitude that...

I would say that I want (a) a sufficiently large group of supporters to
succeed, and (b) a sufficiently small group of opposition that the project
isn't halted.

I feel that the rest of the planet should have the opportunity to find out
more about the Singularity if they become interested, because the
Singularity is a part of human destiny. But if someone wants to just not
know anything about it - shove the whole issue off onto someone else -
well, I may disagree with them, but that's their privilege. I no longer
have the attitude that people MUST care whether they like it or not. You
know the saying: A fanatic is someone who can't change his mind and won't
change the subject.

I just write informative articles. I've given up on (i.e., do not
currently plan on) determining the rates of memetic propagation for an
entire planet. Most likely, the issue will be determined largely by the
pro-technology and anti-technology factions fighting it out, a relatively
small percentage of the global population. There is a small probability
of a "Baylor Jihad" (antitechnology crusade), and a smaller probability of
a planetwide crusade for the Singularity.

I would LOVE to see a planetwide crusade for the Singularity. I would
LOVE to see the public really "getting" it, all of it, and doing the right
thing for the right reasons. And not just from a utilitarian perspective,
either - it's a personal, powerful, and emotional dream that I've been
forced to give up. If it happens, I'll feel awed and humbled and a bit
sorry about underestimating people. But I currently don't expect it to
happen and don't believe I can make it happen, so that project is on a
targets-of-opportunity basis.

> Making the general public aware of the very
> scary things that are possible as technology advances so quickly, (such as the
> consequences of out-of-control nanotech) will only help us to reach the
> Singularity, ultimately.

If the public is interested, great, that's why SIAI has a public website,
but I'm not going to shove anyone's nose in it. I am certainly not going
to go peddle technophobia. It is not emotionally realistic to expect
someone to accept the end of the world *before* you tell them how to save
it - it has to happen the other way around. Similarly, if you tell
someone about a threat, their emotional reaction is to run like blazes the
other way even if it kills them, not make a 10-degree course change to
take the path of least comparative risk.

"Dangers of technology" propagates selectively, and faster than
pro-technology memes, so I'm not about to use it as a carrier; it makes a
Baylor Jihad more likely.

> The Singularity
> is supposed to help EVERYONE, not just those who got in line first.

And it will, regardless of whether they were SIAI programmers, or people
who never heard of the Singularity, or even people who did hear about it
and decided to let someone else worry about it. Why would a Friendly AI
care who got in line first?

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT