Re: Michael Anissimov's 'Shock Level Analysis'

From: Eliezer S. Yudkowsky
Date: Wed Jan 16 2002 - 16:54:53 MST

Nat Jones wrote:
> A fun read. Thanks Michael. Anyone on this mailing list fit the following
> description, if only in spirit, of an SL5'er?

What is this strange fascination that the word "SL5" seems to exert over
people? Why do so many people, confronted with SL0 through SL4, feel this
impulse to top it by coming up with SL5? Oh, never mind. Anyway...

To be a Singularitarian, or even just a good-old-fashioned transhumanist,
is to tap into some very powerful ideas. It is an unfortunate but true
fact that humans tend to malfunction in the presence of powerful ideas, so
if you want to incorporate powerful ideas and remain rational, those ideas
have to be balanced by self-awareness and mental discipline. At the core,
all of this is about *intelligence*. Intelligence is more important than
fanaticism. You can only permit yourself that degree of fanaticism that
does not, at your current level of mental discipline, interfere with your
intelligence. An extreme effort in the service of a counterproductive
goal is worse than nothing.

The absolute primacy of rationality and intelligence is what must be
preserved, above all, as Singularitarian ideas begin to reach out beyond
the core audience of scientifically literate aggressive rationalists.
Compromise that in the name of "recruiting" and the reins will simply be
torn out of your hands by the people you tried to recruit. It has
happened before and may happen to us even if we do everything *right*; we
cannot afford to make deliberate compromises.

The powerful ideas have to be coupled to equally powerful drives toward
rationality, or the actual *goal* isn't going to get done. And I do not,
by the way, believe in the model of a few rational people remaining sane
in order to provide the strategic direction for a larger group of insane
fanatics; I am not aware of any historical instance of this model
working. Everything I know about human nature says that the insane
fanatics would drop the rationalists like a rotting moose carcass and
promote the most insane of their number. Rationality has to be for
*everyone*, the whole membership; if you recruit someone who is not a
scientifically literate aggressive rationalist, then you have to make sure
that your literature tells people "We think rationality is good... try to
make yourself more rational." You have to preserve the part of the
message that says: "Unlike other ideas, Singularitarianism doesn't say
it's a sin to be skeptical or to question the leaders." Because while
some people may take this for granted, others may not.

Now I am an extremist, by anyone's standard, and I know it. I make no
excuses about having devoted my life completely to the Singularity; I have
done so, I admit it, I'm proud of it, and I'd do it again in a minute.
The six billion lives that I *know* are at stake, to say nothing of all
the future lives that may someday come to be, outweigh my own life by an
indescribably enormous factor. I do acknowledge that I need to have fun
every now and then in order to stay in mental shape, but this is not the
same as living a "normal" life in which fun is pursued for its own sake.
And it may be that fun pursued for the sake of efficiency is not as much
fun as fun which is pursued for its own sake. It may even be that this
makes me less efficient. *But*, and this is the key point, to pursue fun
for its own sake I'd have to change my actual picture of the world, and
I'm not willing to do that because it would compromise *rationality*. I
will not compromise rationality even if it makes my life more efficient!
If I keep to the rational course, then eventually I expect to discover
some way to have utilitarian fun that is just as healthy as intrinsic
fun. I'm not going to lie to myself, because that would cut off the
possibility of future progress, even if it delivered a small (or a large)
short-term benefit. If you're going to be a good fanatic, then you have
to remember that good fanaticism is just a very intense effort, and effort
has no intrinsic worth. Dedication proves nothing. It has no value on
its own. It is nothing to be proud of. It is useful only insofar as it
achieves the actual goals, and to compromise rationality removes the
possibility of achieving those goals. Now I do think that complete
dedication is, in fact, very useful, and for that reason I am completely
dedicated, but "dedication" is not the point. It's not something to be
proud of for its own sake.

If you're going to be a Singularitarian fanatic, then you have to remember
that the Singularity is more important than your own fanaticism. I expect
this sounds obvious to most of us, but it is not, in fact, obvious. The
vast majority of fanatics talk and act as if fanaticism is inherently a
sign of moral worth. The vast majority of fanatics do not know how to
safely handle this mental plutonium in order to use it as fuel, which is
why extremism, even extremism in altruism, has such a lousy reputation
among rationalists.

Unlike most other aggressive rationalists, I don't think that extremism is
bad, I just think it has to be handled carefully. But if you *don't know*
how to do that, then you really would be better off distrusting your own
passion, even if it means less mental energy and getting less work done.
You can always come back to the problem later, when you have a little more
practice at rationality.

If you're going to be an extremist, you have to be aware of the forces
underlying human nature. And you have to dodge them, successfully, even
at the price of diminishing your own extremism if that's what it takes.

At this time, and from the "Shock Level Analysis" page, I don't think
Michael Anissimov dodged successfully. There are some obvious PR
problems, but PR problems generally go away with sufficient writing
experience. The main thing that I think represents a real problem is the
"us vs. them" mindset - what I usually call the "group polarization"
dynamic, the tribalism instincts from evolutionary psychology.

People are not divided into groups by future shock level. At most, you
can use FSL to divide your readers into audiences. You should not use it
to divide humanity into tribes, place your own tribe at the top, and heap
hatred on the rest. That'd be too... human. Seriously. The original
"Future Shock" essay I wrote contains an explicit admonishment against
dividing people into feuding tribes by FSL, and this is exactly why;
because I know something about how human psychology works, and I know that
if I establish an ascending scale that could theoretically be applied to
people, even if it's meant to be applied only to audiences in a specific
context, then people are going to see a social hierarchy, and they're
going to try and place themselves at the top. (Hence all the attempts to
top the scale with SL5, of course.)

For example, let's take Anissimov's description of SL -1 and SL -2.
There's some pretty darned hateful language in there, which I will avoid
quoting here, lest it end up permanently in the SL4 archives, in case
Anissimov decides to change it. The point is: Why hate? What does it
accomplish? What good
does it do? It sure doesn't help you outthink an opponent. It is
instinctive to hate, and it may even be politically useful to the
individual to whip up hate against opponents (it worked for McCarthy), but
the point is that it doesn't accomplish anything useful in terms of the
actual *goal*.

I wrote the original Shock Level essay very carefully in order to
establish that I was not establishing a new social hierarchy and putting
myself at the top. That would be stupid, boring, and very human, and is
unworthy of anyone with the slightest smattering of evolutionary
psychology. And now, not to mince words about it, someone has gone out
and done *exactly this*, and it looks every bit as awful as I thought it
would.

The Singularity needs more people who are willing to dedicate themselves
completely and to hell with the usual reserve; with this I agree. But you
can't just take the dedication and leave behind the other things that go
along with it, the things that keep it rational.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT