Re: physical pain is bad (was Re: Dynamic ethics)

From: Philip Goetz (philgoetz@gmail.com)
Date: Mon Jan 23 2006 - 14:37:19 MST


On 1/23/06, Phillip Huggan <cdnprodigy@yahoo.com> wrote:
> Why must we preserve eco-systems in which mammals suffer if a (cheap)
> solution can be found in which the same population of animals don't suffer?
> Why must we preserve economic systems where people suffer? The common-sense
> I speak of is the very simple fact that physical pain is bad. Are you
> seriously questioning this? It sounds to me like you are suggesting
> evolution-of-predator-species is a superior moral guide to follow than is
> utilitarianism.

There are several problems with what you're saying.

One is that we can't preserve other animals in the absence of pain.
Pain is an important motivator in their lives, and suffering is a
consequence of having the freedom to evolve. You can't just put the
lions in one simulation and the gazelles in another - that means
separating each element of the ecosystem out into its own simulation,
and within each simulation only one species can be simulated in
enough detail to allow consciousness, or else the other species in
that simulation will also be conscious and suffer. There are many
other related problems that I won't enumerate here. Essentially, you
can't cheat by simulating everything and eliminating the parts you
don't like - the very thing 17th-century deists criticized the
Christian God for not doing. You end up with something unreal whose
existence can't be justified.

The more important one - which I already explained in my previous
reply, so I won't explain it again, but I will put it in caps this
time - is that THE LION STANDS IN RELATION TO US IN THE SAME WAY
THAT WE STAND IN RELATION TO AN AI. You say we have the moral
authority to put the lion in a fake simulation without asking or
telling it. Hence the AI has the moral authority to put us in the
Matrix, or to dispose of us in any way it sees fit.

This is not a recipe for a good singularity. UNLESS WE PROVIDE
A NEW ETHICAL FRAMEWORK PRE-SINGULARITY, humans will assume that
transhumans will operate in just the way Huggan is proposing, and
they will (justifiably) either prevent anyone from developing
transhuman technology (which is EXACTLY what Leon Kass is doing now,
for roughly the same reason that I just gave!), or they will
KILL TRANSHUMANS ON SIGHT.

- Phil


