From: Gordon Worley (email@example.com)
Date: Sun Jun 16 2002 - 20:24:03 MDT
On Sunday, June 16, 2002, at 09:19 PM, Lee Corbin wrote:
> Gordon writes
>> First off, attachment to humanity is a bias that prevents rational
>> thought. I and others have broken this attachment to keep it from
>> clouding our thinking. It is the result of being genetically related
>> to the rest of humanity, where the death of all human genes is a big
>> enough problem to cause a person to give up a goal or die to save
>> humanity.
> I sense a possible confusion here between loyalty to our
> genes and loyalty to the actual extant 6,000,000,000 living
> humans. I have no loyalty towards the former, and in fact
> strongly hope that we all get uploaded and saved from our
> frail genetic housings.
> I do not agree with Eugen that you should be shot on sight or
> permanently imprisoned. If I were on a star ship, however,
> I would take whatever actions necessary to make sure that
> you never got near the control room. I want to live. I also
> want everyone else to live (forever). I'm appalled at your
> suicidal urge to replace us and our experiences with something
> unknown and (from some point of view that is itself biased) "better".
I certainly have compassion for humans. I really do want humans to be
uploaded.
At the same time, if it turned out that we couldn't upload, I wouldn't
say "okay, that's it, no Singularity or SIs" (this is as much a death
sentence to humans as killing them outright).
I don't see uploading humans as the goal of the Singularity, though. I
see the goal as creating a good successor to humanity: a new
intelligence that is beyond our own. If we can benefit from that, I say
all the better (I wouldn't support Friendly AI if I thought that a good
course of action would be to create an SI with no regard for humans).
> Funny, though, how the longer one lives the more precious
> life seems to become. They draft 17-year-olds because they'll
> fearlessly charge machine gun nests. I myself at 19 thought
> that the human race would be better off not existing just
> because, as you wrote, the "world is full of pain and
> suffering". (What I failed to see at the time is that
> it's equally full of joy and gladness, and it's obvious
> now how evolution had to make it so.)
There is plenty of good in life and in humans, and we've done many
good things. I hope that we'll go on to do more and transcend into a
new level of intelligence. At the same time, if we never get anywhere,
it doesn't matter. If humans lived in caves and ate bugs and never did
anything new, the universe would hardly notice.
>> Some of us, myself included, see the creation of SI as important
>> enough to be more important than humanity's continuation.
I imagine this does sound scary. It seems very hard to say that I want
humans to stay alive, but at the same time say that, just as past human
species gave way to ours, we may have no choice but to give way to our
successors. I hope that we can transcend with them, but if we can't
then we have to learn to deal with that.
>> It's in the interests of the Singularity and SI goal
>> that I not die, because my death would set back the
>> project quite a bit.
> Hollywood's mad scientists are not just figments of imagination!
I'm not sure where you're going with this. Are you suggesting I'm a mad
scientist? Or just making a general comment?
Besides, Eliezer does the laugh much better than I do.
>> Especially everyone else, because if we all die then
>> the goal can't be completed at all.
> I too think it's great when a region of space begins to
> experience, when matter benefits. And, yes, we poor humans
> don't do much benefiting per cubic meter and our plants
> and animals are even more pathetic.
> But at the same time, like I said, for many of us our
> lives and benefit are simply not negotiable.
I'm aware of that, which makes compassion for humans a very good
economic decision, to say nothing of a good ethical one.
--
Gordon Worley
http://www.rbisland.cx/
firstname.lastname@example.org
PGP: 0xBBD3B003

`When I use a word,' Humpty Dumpty said, `it means just what I choose
it to mean--neither more nor less.'  --Lewis Carroll
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT