Re: Effective(?) AI Jail

From: Jimmy Wales
Date: Thu Jun 14 2001 - 17:18:10 MDT

I think this is a good question, but I think there are good answers to it.

I was mostly joking about the harsh approach, but only to make a point. Not all
of us are gullible weaklings ready to be tricked into turning the Universe into
oblivion. And the serious point here is that if we're really afraid the thing
might be that bad, it will understand the dilemma we face and why we have to be
so careful, for our own good.

Is it moral to keep an SI in a box until we are reasonably sure that it isn't going
to destroy us? Yes. And if it turns out to be a sentient being hell-bent on our
destruction, then it is likewise moral to destroy it.

I'm not Yudkowsky-Friendly, even though I'm friendly.

*      The Ever Expanding Free Encyclopedia     *

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT