Re: Threats to the Singularity.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jun 16 2002 - 20:27:45 MDT


Eugen Leitl wrote:
> On Sun, 16 Jun 2002, Gordon Worley wrote:
>
>
>> First off, attachment to humanity is a bias that prevents rational
>> thought. I and others have broken this attachment to keep it from
>
>
> First off, attachment to your life is a bias that prevents rational
> thought.

First off, let's define "attachment" and how it differs from a rational
decision that something is desirable. I happen to feel that BJKlein has an
irrational fear of death because he says he cannot conceive of it being
reasonable for anyone, even an altruist, to sacrifice their own life so that
ten others may live. Death is "not an option". This, to me, indicates that
BJKlein is processing death as emotionally unbearable rather than rationally
undesirable. I judge that my death is intrinsically undesirable. But it is
not irrational to make that judgement while simultaneously judging that the
death of ten other people is ten times as intrinsically undesirable.
However, if one's thoughts are motivated by an existential horror of
personal nonexistence rather than a sentient-symmetrical judgement that
death is evil, the rational decision to sacrifice one's own life may be an
emotional impossibility. This is why I would call "attachment" irrational -
because it acts as a constraint on the set of possible thoughts, and may
cause you to make judgements that are incompatible with your professed
principles.

If something is rationally undesirable, it may still be less undesirable than
some even worse alternative; if something is rationally desirable, there may
still be more desirable things.

As with continued life, one becomes "attached" to humanity - as opposed to
being a rational defender of humanity's continued existence, which I judge
to be a great good - at the point where it becomes *inconceivable* that
something more valuable than humanity could exist. It would be a horror if
humanity were exterminated. If two other sentient species were exterminated,
that would be twice as horrible. If given a choice between the destruction
of Earth and the destruction of ten other planets on which reside roughly
equivalent sentient species (with roughly equivalent chances at the
Singularity), I would choose the destruction of Earth.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

