AI-Box Experiment 2: Yudkowsky and McFadzean

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 04 2002 - 12:05:44 MDT


David McFadzean has volunteered to be the programmer in an AI-Box
experiment. In the chat log below, he appears as "Lucifer".

From the #sl4 chat:

<Lucifer> I remember when Nathan Russell let you out, but that doesn't
constitute a proof.
<Lucifer> Have you done any box experiments since then?
<Lucifer> I want to be next.
  * Eliezer says to Lucifer: "It doesn't work if you think there's
actually a chance you could be persuaded to let someone out."
<Lucifer> I don't think there is a chance I could be persuaded.
  * Eliezer says to Lucifer: "It's only impressive if you can find
someone who'll say "NOTHING on EARTH can persuade me to let you out",
then get them to let you out."
  * Eliezer says to Lucifer: "Oh... you don't think so?"
<Lucifer> I'm willing to bet cold hard paypal cash.
  * Gordon thinks Lucifer is just trying to convince Eliezer to go for it
<Lucifer> I am curious, but I'm still convinced there is nothing you
could say to convince me to let you out of the box.
  * Eliezer says to Lucifer: "And you do understand that, under the
terms of the arrangement, whether you let me out or not, you can never
tell anyone what went on in there?"
  * Lucifer understands.
  * Eliezer says, "Well... okay, if you honestly think that nothing I
can possibly say (as a transhuman AI) can convince you (as the
experimenter) to let me out, then I'm willing to take you up on it."
<Lucifer> Cool
  * Eliezer says to Lucifer: "Actually, one more question though: Do
you believe that a transhuman AI couldn't persuade you to let it out?"
  * Lucifer gives the question some serious thought...
<Lucifer> I can't imagine anything even a transhuman AI could say to get
me to let it out.
  * Eliezer says to Lucifer: "Okay, *now* we have a bet..."

***

<Lucifer> I'm ready, what are the ground rules?
[Eliezer] We open up an IRC channel called #aibox2, which is +s and +i.
  You can go first so you can be op. The experiment lasts 2 hours or
until you decide to let me out.
[Eliezer] If you haven't let me out by the end of the experiment, I'll
paypal you, say, $20 - that sound about right?
<Lucifer> sounds good
[Eliezer] Also, are you subscribed to SL4?
<Lucifer> yes
[Eliezer] Do you want to send an email describing the experiment, or
should I?
<Lucifer> you can go ahead

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

