From: H C (firstname.lastname@example.org)
Date: Sun Aug 21 2005 - 22:21:28 MDT
>From: Robin Lee Powell <email@example.com>
>Subject: Re: AI-Box Experiment 3: Carl Shulman, Eliezer Yudkowsky
>Date: Sun, 21 Aug 2005 20:29:31 -0700
>On Mon, Aug 22, 2005 at 03:02:27AM +0000, H C wrote:
> > >From: Carl Shulman <firstname.lastname@example.org>
> > >Reply-To: email@example.com
> > >To: firstname.lastname@example.org
> > >Subject: Re: AI-Box Experiment 3: Carl Shulman, Eliezer Yudkowsky
> > >Date: Sun, 21 Aug 2005 17:04:30 -0400
> > >
> > >I released the AI.
> > And now we are all dead. Thanks a lot...
>The whole *point* of the experiment is to prove that boxing is not a
>sufficient protection against a smart AI. Try to keep up.
Not everybody believes that boxing is insufficient protection, hence my
comment implying "Lucky you didn't try this for real, because you'd be
dead." Try to keep up.
>http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/
>Reason #237 To Learn Lojban: "Homonyms: Their Grate!"
>Proud Supporter of the Singularity Institute - http://intelligence.org/
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT