[sl4] AI's behaving badly

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Tue Dec 02 2008 - 13:00:05 MST

> @Stuart, what about "Above efficiency and all other goals, put first the
> survival and overall health of the species homo sapiens and its direct
> biological descendants [in the case of speciation]. Do not commit any action
> that may conflict with this goal."

Too easy :-)

"Folded hands" is far too mild a dystopia for what can go wrong here.
First note that you have put this goal above all other goals - a very
dangerous step.

First, the AI will take control of human reproduction.
Depending on what you mean by "the species and its descendants", it
will either forbid humans from reproducing or carry out its own mass
artificial reproduction. In either case, standard human reproduction
is a dangerous risk for both mother and child, and hence will be
forbidden.
Secondly, the AI will entomb each human in an artificial coffin,
feeding them intravenously and keeping them safe from any exterior or
self-inflicted damage. If mental health is not included in "health",
then the AI will destroy people's brains and keep their bodies
healthy. If mental health is included, then the AI will probably store
brains and bodies separately (uploading the brains, if that is allowed
by its definition of health), and keep the brains mentally healthy.

The best way to keep humans mentally healthy would be through
constant drugs and/or a lobotomy of their higher brain functions
(drugs being preferred if "lobotomy" is forbidden by the AI's
definition of health). The most efficient method would probably be to
regress everyone to six-month-old status, suppress the parts of the
brain that refer to boredom, and keep them experiencing the same fuzzy
and warm experience - moving colours, heat, a full belly, a mother's
voice - for all eternity.

Why would the AI do that? Because it values health above all else -
above any objections you might have to this behaviour. It will
disregard human orders, human rights, human fondness for variety,
human fondness for occasional pain and challenge, or human programmers
screaming "that's not what we meant" - because it is compelled to by
its code.


PS: If you want to claim that the AI will not behave the way described
above because of your own definition of mental health, please send me
your definition of mental health.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT