Off topic, of course

From: aominux (aominux@yahoo.com)
Date: Sun Dec 09 2001 - 13:07:54 MST


27 new messages?! What the hell is wrong with you people!

Sorry, I'm in a bad mood. Please... just try to convey your ideas in
fewer emails...

----- Original Message -----
From: "Eliezer S. Yudkowsky" <sentience@pobox.com>
To: <sl4@sysopmind.com>
Sent: Sunday, December 09, 2001 2:55 PM
Subject: Re: The inevitability of death, or the death of inevitability?

> Ben Goertzel wrote:
> >
> > Eli wrote:
> > > > I'm not accusing you of that. I'm merely pointing out that it's
> > > > very hard for humans to accurately assess their priors and reason
> > > > about things in a manner divorced from sentiment.
> > >
> > > I agree, it's very hard. But you have provided no evidence supporting
> > > the assertion that I am guilty of this flaw in any of the specific
> > > cases in question; *first* you must establish this, *then* you may
> > > sigh over the fallibility of humans in general and me in particular.
> >
> > OK, so it's "psychoanalyze Eli" time, is it? What could be more
> > exciting, and more relevant to the future of AI and the universe? ;)
>
> Our annual imports and exports of dental floss? No, just kidding. I'm a
> volunteer; I have no right to complain.
>
> > I would propose that our friend Eliezer has a strong personal desire
> > to be useful and helpful, on a grand scale.
>
> And this is bad... because of why?
>
> It's true that up to, say, age fourteen or so, I had an emotional desire
> to be helpful on a grand scale; this was later rationalized (that's in the
> sense of "made normative", not "irrationally justified") into a desire to
> accomplish the largest possible amount of good through my actions.
> Practically the definition of a goal system, really.
>
> I want to accomplish as much good as possible with my life. The grand
> scale happens to be available, so I go for that.
>
> In emotional terms, you might phrase the transition as follows, from:
>
> "I want to do a huge amount of good"
> to
> "I have the capability to do a huge amount of good, so of course I want to
> do it; who wouldn't?"
>
> > (I won't go so far as to call it a
> > "savior complex" ;). I would propose that this desire is biasing (in a
+
> > direction) his estimate of the probability that it's possible for *any*
> > human being to be useful and helpful in terms of the Singularity and
> > associated events.
>
> One should be aware that there are psychological forces pushing in the
> opposite direction as well. There are psychological forces pushing for
> passivity and refusal of responsibility - if the Singularity is beyond
> your ability to affect, you don't have to do anything about it. There are
> social forces pushing toward modesty. There is a prevailing memetic
> environment designed to provide consolation for insignificance instead of
> motivation to attempt significance.
>
> The truth is that whatever we as humans may be, the last thing we are is
> insignificant. Humanity's entire future hinges on us, and this future
> contains an unimaginable number of sentient beings. Simple division says
> that if the future contains at least six quintillion beings, then each
> living human carries the responsibility for at least a billion lives. The
> moral weight flowing through pre-Singularity Earth is so heavy that it
> doesn't even begin to diminish when divided by a paltry number like six
> billion.
>
> I can only assume that you are measuring 'usefulness' as a percentage of
> the Singularity, rather than weighing it in absolute terms. Very well.
> There's no point in blowing our fuses by trying to measure all moral
> quantities in units of trillions. Nonetheless, I can still hardly be
> accused of irrationality for focusing on the Singularity as opposed to
> other things.
>
> > But, the thing is, our probability estimates regarding such things are
> > *inevitably* pretty damn shaky anyway, and hence very highly
> > susceptible to bias.... Where do you draw the line between bias and
> > intuition?
>
> In normative terms, the line is very definite. If your desires influence
> your conclusions, that's bias. If not, that's intuition.
>
> The processes by which desires influence conclusions (in humans) are not
> invisible. They have distinct mental 'feels'. It's just that these
> feelings usually go unrecognized.
>
> > Where there is very little data, we have to use sentiment to guide our
> > intuitions. There's not enough data to proceed by pure conscious reason.
>
> What's wrong with trying to remain calm and letting our intuitions run on
> their own rails?
>
> > [By the way, in my view, "sentiment" may be analyzed as "a lot of little
> > tiny unconfident reasoning steps merged together, many of them analogies
> > based on personal experience". So you can say it's a kind of reason, but
> > it's different from "conscious reason", which is based on a smaller
> > number of reasoning steps, most of which are fairly confident, and each
> > of which can be scrutinized extensively.]
>
> I rarely have the experience of having an intuition, or any subjective
> feeling, without being able to figure out where it comes from. I guess I
> basically see intuition as an extension of rationality by other means.
>
> -- -- -- -- --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence



