From: Samantha Atkins (email@example.com)
Date: Mon Jun 17 2002 - 03:31:29 MDT
Gordon Worley wrote:
> On Saturday, June 15, 2002, at 07:38 PM, Samantha Atkins wrote:
>> How can or should one be "indifferent" to one's species and to the
>> survival of all existing higher sentients on this planet? If one is
>> for increasing intelligence (how one defines that and why it is the
>> only or most primary value are good questions) and the increase of
>> sentience, I fail to see how one can be cavalier about the destruction
>> of all currently known sentients. How can one stand for intelligence
>> and yet not care about billions of intelligent beings that already exist?
> First off, attachment to humanity is a bias that prevents rational
Rational? By what measure? How is attachment to the well-being
of ourselves and all like us irrational? Whether we transform
or simply cease to exist seems to me to be a perfectly rational
thing to be a bit concerned about. Do you see it otherwise?
>I and others have broken this attachment to keep it from
> clouding our thinking.
So you believe that becoming inhuman and uncaring about the fate
of humanity allows you to think better?
>It is the result of being genetically related to
> the rest of humanity, where the death of all human genes is a big enough
> problem to cause a person to give up a goal or die to save humanity.
I do not necessarily agree that we can just write it off as a
genetic relatedness issue at all. Whether there is sentient
life and whether it continues, regardless of its form, is of
intense interest to me. That some forms are not genetically
related is not of high relevance to the form of my concern. So
please don't assume that explains it away or makes the issue go
away. It doesn't.
> This kind of thinking, usually, is good because it keeps the average
> person from thinking it's okay to blow up Earth. This same kind of
> thinking, though, can be bad if it makes us overly cautious and Luddites.
If it makes it one whit more likely that the existing sentients
survive on, perhaps in a transformed manner, then it is a very
good kind of thinking indeed.
> Some of us, myself included, see the creation of SI as important enough
> to be more important than humanity's continuation. Human beings, being
How do you come to this conclusion? What makes the SI worth
more than all of humanity? That it can outperform them on some
types of computation? Is computational complexity and speed the
sole measure of whether sentient beings have the right to
continued existence? Can you really give a moral justification
or a rational one for this?
> self aware, do present more of an ethical dilemma than cows if it turns
> out that you might be forced to sacrifice some of them. I would like to
> see all of humanity make it into a post Singularity existence and I am
> willing to help make this a reality.
How kind of you. However, from the above it seems you see them
as an ethical dilemma greater than that of cows, but if your SI,
whatever it turns out really to be, seems to require or decides
the death of one or all of them, then you would have to side
with the SI.
Do I read you correctly? If I do, then why do you hold this
position? If I read you correctly then how can you expect the
majority of human beings, if they really understood you, to
consider you as other than a monster?
My own view of the Singularity and all steps up to it is to
transform this world and all its sentient beings, as far as
possible, to a much more promising form of existence. How much
of the incomprehensible post-Singularity possibilities are
claimed depends on many things, but especially on the wishes of
the individual sentient. They will claim different possibilities
as they are ready. No sentient, no matter how powerful, has the
right to deprive them of existence entirely or of their choices.
I don't believe any less than this gives a decent basis for what
we are attempting to do or ends up with a world/universe that I
would want any part of creating or inhabiting. We have the
power to re-create ourselves, this world and an unknown amount
more. How will we use it? Will we just create an SI and leave
it up to it to do whatever it wishes with us, beings only
slightly more problematic than cows?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT