From: Gordon Worley (redbird@rbisland.cx)
Date: Sun Jun 16 2002 - 12:11:42 MDT
On Sunday, June 16, 2002, at 12:58 PM, Eugen Leitl wrote:
> On Sun, 16 Jun 2002, Gordon Worley wrote:
>
>> First off, attachment to humanity is a bias that prevents rational
>> thought. I and others have broken this attachment to keep it from
>
> First off, attachment to your life is a bias that prevents rational
> thought.
Agreed. It's a good attachment to break. It can be hard to stop being
attached to living, but it is possible. That doesn't mean I'm suicidal,
though. If a situation arose in which the universe would benefit from my
death, I'd go right ahead and die.
>> to the rest of humanity, where the death of all human genes is a big
>> enough problem to cause a person to give up a goal or die to save
>> humanity. This kind of thinking, usually, is good because it keeps
>> the average person from thinking it's okay to blow up Earth. This
>> same kind of thinking, though, can be bad if it makes us overly
>> cautious and Luddites.
>
> You of course realize that if it blows up into your face, you're one of
> the first to go. Does this factor anywhere figure in your equation?
Yes. Let's say I'm responsible for the Seed AI for a time (I hope no
one person ever bears all the responsibility, but this is just
hypothetical). It's in the interest of the Singularity and the SI goal
that I not die, because my death would set the project back quite a
bit. Therefore, I'll go out of my way to make sure that neither I nor
anyone else dies. Especially everyone else, because if we all die then
the goal can't be completed at all.
>> Some of us, myself included, see the creation of SI as important enough
>> to be more important than humanity's continuation. Human beings, being
>
> I hope you don't mind, but if you honestly think that, and you have a
> nonzero chance of succeeding, in most people's value system you've just
> earned the privilege to be either incarcerated in maximum security for
> life, or killed on sight, whatever comes first. (In case there are doubts
> about the above, I don't mind executing you in person. Okay?)
I don't really know how to explain the position to you. It's not that I'm
apathetic toward human life; I'd like to see human life stay
around in one form or another. At the same time, I see the goal of
creating SI as more important than creating more humans who will be
born into a world full of pain and suffering. Furthermore, in my
opinion, humans will last maybe another 100 years at most if we continue
without a Singularity. Unless something happens to make all of humanity
suddenly rational, humans will kill themselves one way or another when
they build a big enough weapon to do it. Tribal thinking doesn't work
well when the stick gets too big.
We have a chance to move beyond the limits of human thinking and create
something that can go on to do new and interesting things that humans
can't even begin to conceive.
> I'm not overly surprised that such attitudes exist (random noise gave us
> both Einstein and J. Dahmer), and in fact I've heard Moravec being accused
> of having said something similar in person, but I hope you don't mind
> that we reserve a special circle in hell for those with broken
> empathy-circuits.
If I thought I might be going to hell one day, I wouldn't be a very
rational person.
--
Gordon Worley
http://www.rbisland.cx/
redbird@rbisland.cx
PGP: 0xBBD3B003

`When I use a word,' Humpty Dumpty said, `it means just what I choose
it to mean--neither more nor less.'
    --Lewis Carroll