From: Christian L. (email@example.com)
Date: Sun Jun 03 2001 - 18:32:11 MDT
Mitchell J Porter wrote:
> > If there is a large enough power differential between a
> > and us, egoism will not imply any sort of mutualism. If it doesn't
> > care about us, if we have nothing to offer it, and if we're in its
> > way, we're toast.
Samantha Atkins wrote:
>If that is all there is to ethics, then we're toast. But I
>don't believe it is. I believe sentient life is to be prized
>regardless of whether a particular sentient or type of sentient
>has something to offer you beyond its existence or not.
Is there a good non-anthropomorphic definition of sentience somewhere?
The reason I ask is that SIs might indeed agree with the statement above,
but still do away with humanity because they feel that we are not "sentient"
by THEIR definition of the term. Sentience seems to imply a certain level of
complexity within the processing unit (the brain, in our case): the ability to
recognize the self, for instance.
The SIs might have some particular state of mind that (in their view) sets
them apart from us as our sentience sets us apart from the lower animals.
They might feel a strong sense of bonding with other beings that share this
mindstate, but feel that our sentience isn't all that fancy.
To make an analogy: the SIs might compare MindState X to our sentience
in the same way we compare our sentience to, say, the ability to adapt to a
changing environment (which many animals have).
>that a real superintelligence versus a computing engine without
>any true inner life will conclude eventually (hopefully sooner
>rather than later) that its own existence and by extension the
>existence of all sentients is made more secure by the valuing of
>sentient beings and by a minimum set of agreed rights that lead
>to a minimum level of cooperation and peaceable co-existence.
Again, replace "sentience" with "MindState X", and it could be true.
>An entity that stomps on humans today out of "having no use for
>them" is open to being stomped on by a more capable entity
>tomorrow that has the same lack of ethical constraints.
This is also interesting. A self-enhancing SI would probably know how much
it could increase its own intelligence. If there were an entity that was as
far above the SI as the SI is above humans, the SI would probably enhance
itself to that level pretty quickly.
Even if the scenario above took place, I cannot see why the Super SI (SSI)
would spare the SI just because the SI was nice to humans. That would imply
that if humans are nice to animals, the SIs are more likely to be nice to
humans.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT