From: Daniel Radetsky (daniel@radray.us)
Date: Tue Aug 02 2005 - 17:03:23 MDT
On Tue, 02 Aug 2005 14:07:50 -0400
"Michael Vassar" <michaelvassar@hotmail.com> wrote:
> It seems to me that historically "impossible" has essentially always
> meant "I can't figure out how to do it right now".
I suspect you will be much better off if you use more accurate language.
"Impossible" means "in principle cannot be done." A lot of people have used the
term "impossible" in the past, but not in your sense: they meant mine, and they
were wrong.
> Anyway, we aren't really arguing about what can or cannot be done. We
> all agree that an AI with the solar system at its disposal can get out of a
> box. We are playing the Jared Diamond game of arguing about what can be
> done with a particular set of resources.
Yes, this is exactly what we are doing, because what an AI can do with a
particular set of resources matters a lot more to boxing than what it can do
with a solar system.
> Such mistakes never happen in analytically tractable systems like
> tic-tac-toe, but always happen in complex systems, such as any physical
> system capable of implementing a GAI must be.
I can't tell whether you mean this to be an extensional claim with the
extension being your experience, an extensional claim with the extension being
the actual world, or a necessary claim. The first claim may be true, but I
don't think it tells us much about the other two. The second claim's truth
value is unknown. The last claim is obviously false.
> I actually think that people proposing AI boxes are a bit like literature
> majors proposing to lock McGuyver in "a room full of discarded electronics
> components".
Well, I actually think that Nostradamus and Pauly Shore are the same person.
Also, what the hell do "literature majors" have to do with this argument? Are
they supposed to be a paradigm case of naivete or something?
> Any GAI will have the equipment to produce and detect electromagnetic waves
> of a variety of frequencies, to produce magnetic fields with extremely fine
> precision, to generate extremely focused heat, and probably to manipulate
> mechanical actuators such as those used in the hard drive and cathode ray
> tube (alternatively, a huge field of liquid crystal under fine electronic
> control). It will probably have some ability to reverse all of its input
> devices. It will have a large number of different types of atoms and
> molecules within itself, some of which can probably be used for lasers (in
> most PCs, it will actually have lasers in the CD drive), a power supply, and
> many tools that I have overlooked.
"And these can be used to break out in a way we can't stop as follows..."
Aside from the fact that we can get rid of or restrict a lot of those
components, I don't see why we should believe in the first place that they
would allow the AI to break a box.
Daniel