From: Diego Navarro (firstname.lastname@example.org)
Date: Wed Aug 15 2007 - 09:56:50 MDT
> If we made a super intelligent AI and kept it in a machine with no interface to the outside
> world, we would expect it to escape.
Precisely. The whole "unfigureability" of being inside a simulation
depends critically on the assumption that the simulation-maker is
smarter than we are. OTOH, singularitarianism depends critically on
the idea that we can build something smarter than we are.
A weaker form of "The Matrix" can be seen in "The Truman Show" --
where the intelligence advantage of the simulation-makers is not that
large. Truman eventually figures out he's in a simulation because he
starts noticing the regularities. If we turn out to be smarter than
the simulation-makers, we can notice them too.
This is connected, in a way, to the Nietzschean idea of killing God. I
should really return to work now, though, so I'll just drop the bomb.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT