From: justin corwin (firstname.lastname@example.org)
Date: Wed Jul 13 2005 - 15:11:00 MDT
On 7/13/05, Eliezer S. Yudkowsky <email@example.com> wrote:
> Even if you nuke the entire world back to the seventeenth century and the UFAI
> survives on one Pentium II running on a diesel generator you're still screwed.
You are absolutely right. But if nuking the whole world destroys the AI
as well, or if it runs out of diesel, or if it's too stupid to follow
the civilization reboot because it's running on such limited hardware,
or if we implement locally secured computers next time, then we aren't.
Possibilities are not inevitabilities, and the possibility that the AI
might kill you doesn't mean you can't do anything.
I recognize you're trying to focus AI survival strategy on the top
level case, but other cases exist, and establishing what they are is
potentially important. (And an internet off switch would make me feel
safer too, from AIs at least.)
(META: I've posted too much today, so I'm going to wait until tomorrow
to make any further replies, if any.)
--
Justin Corwin
firstname.lastname@example.org
http://outlawpoet.blogspot.com
http://www.adaptiveai.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT