From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Fri Oct 22 2004 - 11:28:58 MDT
On wta-talk and Extropians, Giulio Prisco recently noted that the Raelian
cult seems to have a surprising amount of transhumanist content, mixed in
with gibberish about flying saucers.
Giulio concluded by asking:
>
> I am sure Rael himself and other top officers never believed in the
> flying-saucers layer so, again, I wonder why it is there when the
> [transhumanist] message would stand on its own.
>
> Then I think that: The Raelians have 60,000 paying members worldwide and
> a lot of money. All transhumanist associations together have perhaps
> 300 paying members. I wonder what conclusions we should make.
I (Eliezer) replied:
>
> You should conclude that... the other 100 cults that tried to get
> started using flying-saucer nonsense didn't make it big, so you never
> heard about them in the media?
There are more people trying to start flying-saucer cults than
transhumanist organizations, so you can't necessarily conclude that the
probability of success for a flying-saucer cult is higher.
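To put rough numbers on that selection effect, here is a back-of-the-envelope
sketch in Python. Every figure in it is invented purely for illustration:

# All numbers below are made up for illustration, not real statistics.
cult_attempts = 100   # hypothetical flying-saucer cult startups
cult_successes = 1    # the single one that made the news
th_attempts = 3       # hypothetical transhumanist organizations
th_successes = 0

print("cult per-attempt success rate:", cult_successes / cult_attempts)  # 0.01
print("transhumanist rate observed:  ", th_successes / th_attempts)      # 0.0

# Three attempts and zero successes is statistically consistent with a
# true per-attempt rate well above the cults' 0.01; the media shows us
# the winners, never the denominator of failed attempts.

With so few trials on the transhumanist side, the observed counts tell us
almost nothing about which strategy succeeds more often per attempt.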
Nonetheless, Giulio has a point.
It is embarrassing that rationalists have more trouble cooperating than
flying-saucer nuts. I'm glad that here on SL4, at least, the "Donate Today
and Tomorrow" initiative made it nearly an entire day before the inevitable
chorus of naysaying started. The Extropians list was not so lucky.
I draw the lesson that it's dangerous to be half a rationalist. If you pick
up some rationalist skills, there are certain other skills you have to
adopt to compensate. If you learn to see the flaws in arguments, you also
have to learn additional skills to make sure you apply the same level of
criticism to the ideas you like as to those you dislike. Otherwise, the
effect of learning to nitpick is to lock you into ideas that make you feel
good, which really isn't what rationality is about.
If you learn not to be certain, you also need to learn to live and act
without certainty, and not demand overwhelming evidence to compel you. A
rationalist knows that not even the theory of evolution is certain (though
I would argue that it is the most strongly confirmed theory in the history
of science). Creationists think that as long as evolution is not certain,
they don't have to believe it. If you cast aside certainty and learn the
art of skepticism, you'd better also cast aside the principle that only
certainty can compel you to give up an idea.
Cultists are like bosons, which crowd happily into a single shared state;
transhumanists are like fermions, no two of which can occupy the same state.
I am going to describe, as dispassionately as I can, the difference between
Raelianism and transhumanism that makes transhumanism less effective.
If a high-status Raelian says something, no one can question him; if you
try you're out of the cult. There's a chorus of agreement instead of a
chorus of disagreement. Everyone sees the atmosphere of agreement, and
they feel more confident in the ideas presented. Anyone who doesn't feel
confident keeps their objections to themselves. Anyone who really
disagrees leaves the cult, and then the remaining people are more in
agreement and reinforce each other. Like evaporative cooling: the
fast-moving minds are ejected, and what remains is cooler and more uniform.
A counterintuitive observation of researchers who study cults is that cults
often increase in fanaticism following what ought to be a setback - for
example, a revelation that the founder clubbed cute baby seals or had an
affair with Nathaniel Branden. Part of what happens is
that the people with a lingering trace of sanity leave first, and then
they're no longer around to hold the others back, who reinforce each other
still further. When new people come in, they're confronted with
wall-to-wall agreement, and the part of human nature that is vulnerable to
peer pressure concludes that if everyone else agrees on something, it must
be insane to think otherwise. Raising a doubt is met with scorn and other
forms of negative reinforcement, making people less likely to doubt again.
Consensus builds, discord dies, the Raelians move in harmony and lase.
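As a toy model of that evaporative-cooling dynamic, here is a short Python
sketch. The group size, the distribution, and the everyone-below-the-mean-
leaves rule are all arbitrary assumptions, chosen only to show the direction
of the effect:

import random, statistics

random.seed(0)
# Hypothetical "commitment" scores for 1000 members, in arbitrary units.
members = [random.gauss(0.0, 1.0) for _ in range(1000)]

for setback in range(1, 4):
    # After each setback, everyone below the group's mean commitment leaves.
    cutoff = statistics.mean(members)
    members = [m for m in members if m >= cutoff]
    print(setback, len(members),
          round(statistics.mean(members), 2),
          round(statistics.stdev(members), 2))

# Each pass, the group shrinks, the mean commitment climbs, and the spread
# of opinion narrows: consensus builds precisely because the doubters leave.

The exact numbers don't matter; what matters is the direction. Every
departure of a moderate raises the average fanaticism of those who remain.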
And this also of the Raelians: They are not afraid to be passionate about
their ideas.
Now let's look at transhumanism. SIAI has already received on the order of
20 donations. I wasn't planning to reveal this number until afterward, but
these strange circumstances compel me to do so. Some of the donors
included congratulatory notes saying how effective the essay was, or how it
finally inspired them to get moving, and so on. Here's a sample, quoted
with permission from Jesse Merriman, who donated $111.11: "I decided to
give SIAI a little bit more. One more hundred, one more ten, one more
single, one more dime, and one more penny. All may not be for one, but
this one is trying to be for all."
But none of those donors posted their congratulations and agreement to a
mailing list, not one. As far as any of those donors knew, they were
alone. And now they're watching people scornfully explain why they
shouldn't have donated! The criticism, the justifications or excuses for
not donating - *that* part is displayed proudly in the open. A newcomer
would see wall-to-wall disagreement.
This is, in its own warped way, just as wrong as what the Raelians are doing.
If a bias toward credulity is wrong, you still can't go right by reversing
the bias. The opposite of a great mistake is nearly always another great
mistake, the correct answer being something else entirely.
Suppose that 20 donors *had* posted their agreement on the SL4 list. You'd
feel pretty uncomfortable joining them, wouldn't you? I know that if I
said something at a talk and twenty people in the audience stood up and
shouted "You're absolutely right!", I'd stand around with my mouth open,
completely at a loss for words. And you'd also be unnerved, right? Much
more unnerved than if you were at a conference and 20 people asked scathing
questions of a speaker. We're as uncomfortable together as the Raelians
are apart.
That's just as wrong, and if we ever want to get anywhere, we'll have to
make a deliberate effort to get over it. It's dangerous to be half a
rationalist. If you master some skills, you have to master others or end
up worse off than before. If you learn to disagree with authority without
shame, you have to learn to agree with authority without shame. If you
challenge conventional ideas proudly, you have to accept conventional ideas
proudly. I know how to deal with disagreement, but hearing people agree
with me makes me feel uncomfortable. This I acknowledge as my failing, and
I accept responsibility for getting over it.
I think we can do, if not quite as well as flying-saucer nuts, maybe
one-tenth as well. But it's going to take a deliberate effort.
Rationalists can hold their own against irrationalists, but not easily.
We have to be strong without being certain, by a deliberate act of will, by
the conscious decision that certainty is not required for strength.
We have to act in unison without being conformist, by an act of will, by
the conscious choice that it makes sense to cooperate even when we aren't
in full agreement.
We have to forsake peer pressure and the instinct to believe what others
are saying, and consciously evaluate the probability that someone else
knows more than we do, taking into account neither personal like nor
personal dislike.
We have to reject the common misconception that the art of finding the
correct answer to questions of fact - the art some call "rationality" -
means cynicism, ironic detachment, and the refusal to feel emotion. Let us
choose our beliefs on the sole basis of correspondence with reality. If
those beliefs call forth our passions, then let us feel!
We have to learn to express our unity as well as our disagreement, speak
our rational agreement along with our rational criticism, show newcomers
*both* sides of the issue, swallow hard and defy our fear of public harmony.
I ask of SIAI's donors: Speak up, and hold your heads high without shame!
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence