From: David Clark (firstname.lastname@example.org)
Date: Fri Feb 04 2005 - 11:04:04 MST
----- Original Message -----
From: "Phil Goetz" <email@example.com>
Sent: Friday, February 04, 2005 7:37 AM
Subject: Re: Ethics
> Current ethical systems are based on interactions
> between agents with equal rights. This will not work
> when the agents have vastly dissimilar needs,
> intelligences, and inclinations. Imagine trying to
> run a democracy in a population in which the
> standard deviation of IQ was 50 points.
Isn't this the case already? Doesn't our population *already* have a
standard deviation of IQ of 50 points or more? Don't the more intelligent
people already feel totally alienated from the current political climate and
the vast majority of other people? Don't intelligent people already have
vastly dissimilar needs that are *not* currently being fulfilled?
> The smart
> people actually making the decisions would not be
> able to explain their reasons to the populace, and
> would have to use deception and manipulation for
> everything they did.
Deception and manipulation are already being used to get the masses to vote
and support most policies. The only real problem is that the *smart people*
are not the ones making those decisions. The current decision-makers are
just bureaucratic robotic drones. It definitely is a problem for the *smart
people* to live with and be judged by, other people who look at them and
their ideas as if they were alien creatures. Current juries are not
intellectually capable of understanding most trials, and when it comes to
scientific data, neither are the judges.
> Democracy would be a joke.
> Democracy would be evolutionarily unstable in
> such an environment; a new power structure would
> have to arise with unequal rights.
Democracy is already a *joke*. I totally agree that our current democracy
is and will become more unstable. I also agree with the need for a new
power structure. I know you are saying that these things will happen *if*
there is a large difference in intelligence, but I would say that that is
already the case.
> Suppose that we widened the circle to include all
> mammals. We would not be able to build any new
> buildings, because that would be taking habitat
> from other animals. We would have to stop using
> cars, which kill many mammals. We would have to
> become vegans. We would not be able to ride horses,
> or keep dogs as pets, or even use seeing-eye dogs.
> Our nation would immediately be bankrupted by the
> popular vote to spend all our money on a public
> handout of liv-a-snaps. You get the picture.
> That's the sort of society you're talking about.
> The variety of transhumans could be as great as
> the variety of mammals.
I see no need to include animals in our ethical systems. All living
creatures take up space and resources that could be used to sustain other
creatures. I don't want all the animals and plants on Earth to become
extinct, but unless people will enforce a limit on the right of women to
have children, then animals will, by necessity, come in second place.
You hit the nail on the head when you say that "Our nation would
immediately be bankrupted by the popular vote to spend all our money on a
public handout". When everyone has a single vote to spend money that was
*not* collected equally, you have a system that cannot work in the long run.
Even if we are only talking about policies that don't involve money, most
people have no idea what the correct answer is. In a democracy, there is no
weight given to more or less informed opinions. The only thing that has
saved us so far is that the decisions have been relatively simple ones and
the masses have been willing to be led down a path they wouldn't have
chosen on their own. This situation will become more and more unstable as
we continue toward the singularity.
Transhumans will not tolerate democracy (for everyone else, even if they
think it is OK for themselves), and most smart people today aren't satisfied
with the status quo either.
> The notion of "sentience", when closely examined,
> turns out to mean nothing other than "human",
> so it is not a useful qualification.
Totally true. Why would an intelligent silicon-based AI need to think or act
like a human to be considered "a self-aware independent entity"?
-- David Clark
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:50 MDT