Re[2]: Will FAI need simulated empathy to avoid sociopathic tendencies?

From: Cliff Stabbert
Date: Mon Dec 16 2002 - 20:09:45 MST

Gordon Worley:
GW> Empathy is a complex adaptation that evolved over millions of years
GW> in response to situations arising from living in kin groups. In a
GW> sense, empathy is coming to morality backwards (trying different
GW> kinds of behavior until you find one that works well).

Now that I've started to read up a little on Evolutionary Psychology,
I find that I disagree with a certain over-simplification I sense in
statements such as the above (not with the basic notion of EP, which
seems pretty sound). Of course all behaviour is "just" a complex
adaptation, in some sense. You didn't say "just", but I feel it's
implied in your statement and others I've read -- it feels dismissive,
like saying beauty is "just" in your mind.

It's a hindsight thing, and a question of at which point the hindsight
gets applied. Looking back from here it may seem that the only
"purpose" of empathy was survival of the species, or the gene; looking
back from some future date it may seem that morality was its
"purpose"; that feelings of empathy, combined with early reasoning
ability, gave rise to basic morals such as those embodied in the
golden rule and in concepts such as karma. Or even that this sense of
morality
which prompted humans to create a Friendly AI and bring about the
Singularity was the "purpose" of empathy.

Of course, ascribing purpose to evolution is to anthropomorphise it,
in a sense; that's hard to avoid in any discussion of evolution,
although not as hard to conceal.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT