From: Metaqualia (metaqualia@mynichi.com)
Date: Mon Mar 01 2004 - 09:58:14 MST
It seems that morality means different things to different people.
So, for a moment, I invite you to stop thinking about morality with all its
cultural and semantic baggage and just think about reality without words.
Pick your favourite victim.
Take North Korean prisoners. Their guilt-by-association system has thrown
entire families into labor camps; people come out at age 20 and reveal having
spent their entire childhood in a labor camp, treated like trash, watching
their relatives beaten to death.
Now, tell me: what is the REAL reason this shouldn't happen?
Is it because it is a system that won't be accepted?
Is it because it limits the growth of patterns in the universe?
If not, what is it?
I think there can be personal differences at this stage. That the average
human, confronted with senseless cruelty, has an empathic response and
usually raises objections does not vary greatly (although there are some
very relevant exceptions). But the shape this empathic response takes, and
the way it is rationalized, vary greatly.
To me, in the most direct sense, being empathic means putting myself in
others' shoes.
It means I can simulate, though not precisely, what people in these
miserable conditions are feeling. I say "qualia stream" because that is
the way I have learned to dissociate the material events in the brain from
the actual first-person perspective, but let's let go of these things for a
second. The reason labor camps are not fair is that the way it feels to
be in a labor camp is self-evidently evil. There is no need to justify
this, and whoever wants to argue can try going to a labor camp. Now, if the
people in labor camps didn't +feel+ anything, who would care? Cut them into
pieces, burn them alive, do whatever! They feel nothing! You could never
be in their shoes because there is nobody in their shoes. But if there is
an equivalent of <self> instanced in those people, then labor camps are
evil.
So my particular response to conditions of agony is:
1. simulation of the other's first-person experience
2. evaluation of the experience as immediately evil or good
You can argue that (1) is a result of imperfectly deceptive neural
architecture, but I can still find logical ways of justifying the need to do
this (as I did in my previous message), ways that would be accepted in a
purely logical framework. That is not what I am trying to prove here, though.
I can also show that if we assume a very intelligent AI will have qualia,
then it should arrive at the same kind of morality and empathic evaluation.
But I am not trying to prove that here either, because there are some
fundamental points to be agreed on first.
If we can all agree that 1. and 2. are what bring us to say "labor camps
are not moral", then we are getting somewhere. But if you tell me that your
response is
x. looks like people in the camp are going to rebel any time now
y. free people wouldn't stay in labor camps
z. therefore labor camps are evil
or
j. people in labor camps do not become scientists
k. therefore seed AIs are not built
l. so the production of information patterns is limited
then I don't think we can discuss "morality", since we are using the same
word to mean different things. We can still discuss these things
individually (let's call them 1-2, x-y-z, and j-k-l), but calling them all
"morality" only makes the problem harder, since we would be arguing about
different things that have common elements and roughly the same effects.
mq