From: Mike (mikew12345@cox.net)
Date: Mon Jun 28 2004 - 20:25:44 MDT
> The ai-box experiments <http://yudkowsky.net/essays/aibox.html> may be
> of relevance here.
Having given this some thought over the last year or so, I would like to
accept this challenge. In fact, I'll sweeten the pot:
- If Eliezer convinces me to let him out within 2 hours, I'll contribute
$100 to SIAI, to be used for whatever purpose he chooses.
- And if I keep him caged up for 2 hours, I'll accept his $10.
Recognizing some of the risks involved, I have strong reasons for
wanting to keep him in the box, and I'm certain there is no way that I
can be convinced to let him out before the 2 hours are up.
I have only one condition to add to those I've read so far: if I tell
the AI that I want to talk to it for 2 hours before I decide, Eliezer
can't simply declare "the 2 hours have passed." If the test is based on 2
realtime hours, then I am referring to those same 2 hours. He is free to
convince me to change my mind within those 2 hours, but he can't just
skip over them. I fully expect that I can have a 2-hour realtime
conversation with the AI and not have my mind changed.
Mike Williams