From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Wed Jun 02 2004 - 12:46:25 MDT
Philip Sutton wrote:
>
> It can attract followers who are less thoughtful because they like
> basking in the brilliance of their guru.
Oh, c'mon, you can't possibly believe I'm doing this to attract followers.
Quite the reverse. Now that there are three people in the Singularity
Institute, it's time for people to get used to the fact that I AM NOT
PERFECT. One hell of a badass rationalist, yes, but not perfect in other
ways. I'm just here to handle the mad science part of the job. It may
even be that I'm not very nice. Altruistic towards humans in general, yes,
but with a strong tendency to think that any given human would be of
greater worth to the human species if they were hung off a balloon as
ballast. So frickin' what? I'm not SIAI's PR guy, and under the new
edition of FAI theory, I don't have to be perfect. Everyone get used to
the fact that I'm not perfect, including people who have long thought I am
not perfect, and feel a strong need to inform me of this fact. I'm not
perfect, and it doesn't matter, because there are other people in SIAI than
me, and I'm *not* a guru. Just a mad scientist working happily away in the
basement, who can say what he likes. Believe it, and it will be true.
> It can attract followers who are not as brilliant but who like being
> arrogant too and take the cue from their arrogant guru to behave
> similarly, even though they are less justified in having such a good
> opinion of themselves.
True.
> In my experience, clever people are not always clever *all* the time
> and are not always right *all* the time.
Then I shall aspire to be cleverer than those not-very-clever people whom
you experienced. And cleverer than my past self, who was foolish enough to
make mistakes.
> So it is very useful for clever
> people to be mindful of that and open to other people's ideas and
> reactions which may alert the brilliant ones to errors, gaps, and
> occasionally less than brilliant ideas. I think it is hard to be in mindful
> mode and overweening arrogance mode at the same time - even for
> clever people who can multitask.
I think you're flat wrong. Arrogance and humility are not opposites. They
are separate human capacities.
> So if getting the science as good as it can be is a priority, and doing it
> fast is important, then some moderation in arrogance might be helpful.
Nah. I ain't buying this.
> And also an organisation that is dominated by somebody who is
> uncontrollably and severely arrogant is likely to develop dysfunctional
> tendencies that can cause it to perform suboptimally and thus
> jeopardise its mission.
Actually, I've been watching SIAI's dynamics and I don't think I'm the Sole
Guy any more. Tyler Emerson and Michael Anissimov are going to scream at
me about this, and I'm doing it anyway because I'm guessing that public
relations simply don't work the way people think. But in any case, I'm not
going to be "uncontrollably and severely arrogant" like people who build
huge successful business empires. I'm just going to be an impish, bad-boy
mad scientist. That's my new public image and I'm sticking to it.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence