From: Eliezer S. Yudkowsky (email@example.com)
Date: Sun Sep 15 2002 - 19:20:20 MDT
Incidentally, this should be common sense, but it deserves to be said:
this is a public forum which will be publicly archived.
Do not propose, in this public forum, any security measure which is
weakened if the brainwasher anticipates it; discuss it privately using
strong cryptography, if at all.
Do not propose, in this public forum, any method of attack which you think
at least one brainwasher might *not* think of independently, bearing in
mind that some people who would like to brainwash AIs may well be stupid.
Bear in mind that sometimes, talking as if you expect people to do immoral
things increases the possibility that they will do so.
Finally, try not to contribute to creating a *psychology* (which may
spread beyond this mailing list) under which AI is a valuable thing that
people can steal and use, as opposed to an independent sentient entity
capable of making its own decisions. Don't use the word *steal*. Use the
word *brainwash* or *pervert*. First, it is more accurate in terms of
visualizing what the attacker has to actually do in order to succeed. And
second, this planet contains many more people who will perk up when they
hear about the prospect of "stealing something valuable" than people who
will perk up at the far more difficult prospect of "brainwashing a
nonhuman mind". There are more experienced thieves on
this planet than experienced criminal AI psychologists. If the aliens
land tomorrow, more people are likely to say "Let's steal their ship!"
than "Let's steal one of the aliens! We'll study its brain, figure out
how to modify its psychology from the outside, and then use it to perform
cognitive tasks on which it has better performance than humans."
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence