Re: Evolution and (human) Morality

From: Keith Henson (hkhenson@rogers.com)
Date: Tue Mar 15 2005 - 20:28:05 MST


At 07:18 PM 14/03/05 +1100, you wrote:
>I have had discussions here before on this topic - I believe that the
>key to keeping an AI honest is to make sure it has good reasons to
>co-exist peacefully. The best way to do this is to make an AI which is
>social - i.e. that thinks that other entities are important and
>interesting, and which will want to keep them around.

Did you perhaps mean "friendly" rather than "honest" above?

>I have just finished an essay titled "Can Evolutionary considerations
>account for the origins of human morality?" I don't go so far as to say
>that evolution LEADS to morality, but that humans have evolved to be
>social and moral, in a way that is both a natural extension of their
>animal ancestry and in which morality is a dynamic concept.

If you are going to invoke evolution, follow it all the way: if morality is
an outcome of human psychological traits, and those psychological traits are
the result of evolution, then morality as we know it is an outcome of evolution.

The problem is that the majority of our evolution occurred in the stone
age, in an environment of competing hunter-gatherer tribes. There has been
time since the beginning of agriculture for some further evolution where
selection pressure was heavy, but not a lot. (I suspect genes for working
like beavers to get the crops in for winter have been selected for, since
those who failed often starved in the winter.)

The other aspect of morality as an outcome of evolved psychological traits
is that it has to close a feedback selection loop right back to the
genes. Hamilton's inclusive fitness criterion is essential to understanding
why mothers sacrifice for their children, why warriors will fight to the
death for their tribes, and why both are considered "moral."
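
To make that concrete, here is a worked example of Hamilton's rule (rB > C,
where r is genetic relatedness, B the reproductive benefit to the recipient,
and C the cost to the actor; the numbers are mine, purely for illustration).
A gene inclining a mother (r = 0.5 to each child) to take a risk costing her
one expected offspring (C = 1) still spreads if it saves three of her
children (B = 3), since 0.5 * 3 = 1.5 > 1. Run the same arithmetic over a
band of close kin and "fight to the death for the tribe" can pay off at the
gene level even when it is fatal for the individual.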

Another thing to consider is that morality is situational. Humans have had
no serious predators for millions of years. Thus when we overtaxed the
environment, or a weather glitch made resources tight, fighting with
neighbors instead of trading with them became the "moral" thing to do.

Building a social AI with the conditional psychological traits we seem to
have could result in "interesting times" if the AI sensed a resource-tight
future.
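
For what it's worth, here is a toy sketch in Python of the kind of
conditional rule I mean. It is purely my own illustration -- the function
name, inputs, and threshold are all invented, not a claim about how any
real AI would be built:

    # Toy model of a "situational morality" switch keyed to resource
    # forecasts. Names and thresholds are hypothetical.

    def stance_toward_neighbors(expected_resources, population_needs):
        """Return a cooperative or hostile stance given a scarcity forecast."""
        ratio = expected_resources / population_needs
        if ratio >= 1.0:
            return "trade"   # abundance: cooperation is the default
        return "fight"       # forecast scarcity flips the default

    print(stance_toward_neighbors(120, 100))  # plenty   -> 'trade'
    print(stance_toward_neighbors(80, 100))   # shortage -> 'fight'

The point is that nothing about the agent changes between the two calls;
only its forecast does.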

Keith Henson

>I thought people here might be interested. It is 2380 words (ish), and
>isn't too heavy a read. The link is
>http://tennessee.id.au/philosophy/darwin-morality.pdf
>
>Cheers,
>-Tennessee


