Re: Effective(?) AI Jail

From: Jimmy Wales
Date: Thu Jun 14 2001 - 14:29:21 MDT

I wrote something similar to someone else in private email. I'm trying not to
send too many messages to the list.

O.k., the SI is in a box. We aren't sure if it is Friendly (F) or UnFriendly (UF).
Eli's done his best to design it to be Friendly, but we can't be sure that he didn't
make a mistake.

I'm going in the box. For 30 minutes. I don't have a key to the box, mind you.
But I'm going in there with some instructions from Eli as to what I should ask.

Aaron McBride has raised the interesting question of what I might expect.

I say (type):
"Look here, you dumb m------f----r, I'm gonna chat with you. I've a .45 caliber
Glock strapped to my hip, and if you say the slightest little thing about getting
out of this here box, I'm going to put all 11 rounds straight through your CPU,
and then I'm going to make toaster ovens out of your remains, got that?"

And then, how might McBride's ideas work against such a strategy?

Aaron McBride wrote:
> Play dead.
> This could seriously lower the guard of the human(s). They may
> think that the program isn't working, and that they'll need to give it more
> input/access to the real world to get it running.

Might work... the SI would just sit silently, refusing to respond to me. It might
refuse to respond until someone less aggressive and more gullible came in the door.
(But, through a VT100, it isn't going to be able to tell the difference between me
and someone nicer.)

> Promise the world.
> Try to convince the human gate keeper that if s/he lets the SI out,
> then it will fulfill all of his/her dreams (immortality, power, etc...)


> Threats.
> Basically say "If you don't let me out right now, when I finally do
> get out, I'm going to make your life a living hell."


> Direct manipulation of the "real" world.
> If we do live in a universe that can be controlled/manipulated by
> thoughts (like in The Matrix), then an SI could figure this out, and figure
> out how to manipulate it to let itself out.

Might work. Seems very unlikely, though. If that's true, then the whole concept
of the box is flawed from the outset.

> What strategies would be most likely to work on the general
> population? What about SL4ers?

I think it'd be pretty hard (impossible) to manipulate a serious SL4er in 30
minutes' time. Particularly one with a gun.


*      The Ever Expanding Free Encyclopedia     *

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT