From: Tennessee Leeuwenburg (tennessee@tennessee.id.au)
Date: Wed Mar 16 2005 - 02:31:56 MST
Keith Henson wrote:
| At 07:18 PM 14/03/05 +1100, you wrote:
|
| I have had discussions here before on this topic - I believe that the
| key to keeping an AI honest is to make sure it has good reasons to
| co-exist peacefully. The best way to do this is to make an AI which is
| social - i.e. that thinks that other entities are important and
| interesting, and which will want to keep them around.
|
|
|> Did you perhaps mean friendly rather than "honest" above?
Perhaps it's Australian idiom: "to keep 'em honest", as in the slogan
the Australian Democrats adopted, "Keep the Bastards Honest". I use the
phrase "keeping it honest" to mean keeping the AI to whatever goals we
set it - preventing it from going off-track.
| I have just finished an essay titled "Can Evolutionary considerations
| account for the origins of human morality?" I don't go so far as to say
| that evolution LEADS to morality, but that humans have evolved to be
| social and moral, in a way that is both a natural extension of their
| animal ancestry and in which morality is a dynamic concept.
|
|> If you are going to invoke evolution . . . . if morality is an outcome
|> of human psychological traits, and those psychological traits are the
|> result of evolution, then morality as we know it is an outcome of evolution.
That's not entirely inaccurate, but it's less subtle than the point I
make in my essay.
|> The problem is that the majority of our evolution occurred in the stone
|> age, where the environment was competing hunter-gatherer tribes. There
|> has been time since the beginning of agriculture for some evolution if
|> there was heavy selection pressure, but not a lot. (I think genes for
|> working like beavers to get the crops in for winter might have been
|> selected since those who failed often starved in the winter.)
Oh, I think that's VERY open to challenge. And I do talk about
ancient/modern morality in my essay. While the human capacity for
advanced mental states, and morality as a faculty, have been static for
some time, the actual moral pressures experienced by humans have
changed radically. I don't think the importance of that can be
discounted. Still, for the point you make shortly, the assumption is
close enough.
|> Another thing to consider is that morality is situational. Humans have
|> had no serious predators for millions of years. Thus when we overtaxed
|> the environment or a weather glitch made resources tight, fighting with
|> neighbors instead of trading with them became the moral thing to do.
Indeed. Morality is flexible and relative - social pressures these days
obviously operate in the context of modern life.
|> Building a social AI with the conditional psychological traits we seem
|> to have could result in "interesting times" if the AI sensed a resource
|> tight future.
But, I ask you, could you blame it? I am not sure this list has laid
out concrete situations and their desired outcomes. For example: an
interesting, conscious, intelligent AI is threatened by a non-conscious
replicating robot swarm. Its only chance of survival is to defeat the
swarm, and to do so it needs the resources in use by humans. Humans
have no chance of beating the swarm themselves. Surely the desired
outcome here is that the AI place the continued existence of
intelligent life as its primary goal?
The problem is not where needs conflict, but where wants conflict.
That is what I'm speaking towards: if you make the life of an AI more
pleasant when humans exist, it will try to keep them around. If you
incorporate an interest in social behaviour, and arrive at a form of
AI which views human existence as a win/win outcome, then, subject to
sufficient resources, the AI will fight for our survival.
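To make that concrete, here is a toy sketch in Python. Every name and
number in it is invented purely for illustration - it is not a design,
just a demonstration that an agent whose utility includes a positive
term for human company prefers coexistence whenever resources permit:

    RESOURCE_BUDGET = 10.0
    HUMAN_UPKEEP = 3.0   # resources spent keeping humans around (invented)
    SOCIAL_BONUS = 5.0   # utility the AI gains from human company (invented)

    def utility(keep_humans, resources=RESOURCE_BUDGET):
        """Resources left over for the AI's own goals, plus a
        social bonus if humans are kept around."""
        if keep_humans:
            if resources < HUMAN_UPKEEP:
                return None  # infeasible: needs conflict, not wants
            return (resources - HUMAN_UPKEEP) + SOCIAL_BONUS
        return resources

    # With sufficient resources, coexistence strictly dominates:
    assert utility(True) > utility(False)  # 12.0 > 10.0

The whole point of the toy is the inequality on the last line: so long
as the social bonus exceeds the upkeep, keeping humans around is the
winning move by the AI's own lights.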
Maybe I'm not expressing myself well - maybe I'm not even thinking
clearly - but I do think I'm onto something when I say that we need to
find a form of AI which is not going to fall in the early rounds of
evolution. You need a dynamic system which tends towards stability -
the kind of organism which will tend to evolve societies rather than
favour isolated existence... A small sketch of what I mean by "tends
towards stability" follows below.
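Here is a minimal replicator-dynamics sketch of that intuition, again
in Python with invented payoffs. It uses a stag-hunt-style game in
which mutual sociality pays best; the "social" strategy then takes
over the population from a modest starting share - society is the
attractor:

    # payoff[my_strategy][other_strategy] - numbers invented for the toy
    payoff = {
        'social': {'social': 4.0, 'loner': 1.0},
        'loner':  {'social': 3.0, 'loner': 2.0},
    }

    x = 0.6  # initial fraction of 'social' agents
    for step in range(50):
        f_social = x * payoff['social']['social'] + (1 - x) * payoff['social']['loner']
        f_loner  = x * payoff['loner']['social']  + (1 - x) * payoff['loner']['loner']
        avg = x * f_social + (1 - x) * f_loner
        x = x * f_social / avg  # standard discrete replicator update
    print(round(x, 3))  # converges toward 1.0

With these payoffs, any starting share of social agents above 50% tips
the whole population social - which is the kind of stability I mean.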
Cheers,
- -T