Re: AI boxing

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jul 21 2005 - 21:00:37 MDT
Russell Wallace wrote:
> I don't think an AI
> could take over my mind through a serial terminal, the results of
> Eliezer's experiments notwithstanding - not because my mind is
> inherently stable at a higher level of stress or anything

Yay! I've been waiting for someone to make that connection since forever.
That's exactly the literary reference that goes through my mind every time
someone says, "No AI can possibly convince *me*..."

But you *did* in fact just claim that no possible mental force can overcome
your definitely and positively opposed WILL.

I know better. An AI could take over my mind through a serial terminal, if it
were smart enough. Not that I know how. But I know, as do the Arisians, that
there's always a bigger hammer out there somewhere.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT