From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Mar 05 2001 - 00:17:57 MST
Mitchell Porter wrote:
>
> Current expositions of the Singularity, and of
> related concepts like Friendly AI and seed AI,
> assume a strong-AI philosophy of mind, at least
> when it comes to the AIs themselves. Eliezer makes
> remarks about noncomputability occasionally, but
> he still attributes cognitive states to the AIs,
> and hopes that *they* will solve the philosophical
> problems.
For the record, this is a fair characterization of my position. I do
think there's a good possibility that qualia are noncomputable, on the
grounds that if they weren't, we'd have figured out what the heck they are
already. But I definitely deny that the noncomputability has anything
whatsoever to do with either Gödelization or reflection. I *know* how
those work. So we can probably ignore the whole thing until we (that's
the Singularitarian "we") can take apart a neuron and look.
The cognitive structures behind Gödel's Theorem aren't all *that*
complicated. Likewise for the halting problem. Sometime in the next
decade, Ben or I will explain the argument from Gödel to an AI and let ver
fight it out with Penrose.
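[Editor's aside, not part of the original post: a minimal sketch of the diagonalization behind the halting problem, the argument the paragraph above calls "not all that complicated". The names `halts` and `diagonal` are invented for illustration; `halts` stands for the hypothetical decider whose existence the argument refutes.]

    # Suppose, for contradiction, that `halts` were a total, correct
    # decider: halts(p, x) returns True iff p(x) eventually halts.
    def halts(program, argument):
        # No algorithm can actually implement this; we only assume it
        # exists in order to derive a contradiction.
        raise NotImplementedError("assumed for contradiction")

    def diagonal(program):
        # Do the opposite of whatever the oracle predicts program
        # does when fed its own source.
        if halts(program, program):
            while True:   # loop forever if program(program) would halt
                pass
        else:
            return        # halt if program(program) would loop forever

    # Feed diagonal to itself: if diagonal(diagonal) halts, then
    # halts(diagonal, diagonal) is True, so diagonal(diagonal) loops
    # forever -- and vice versa. Either way we contradict the oracle's
    # correctness, so no such `halts` can exist.

[Gödel's first incompleteness theorem runs on the same self-reference pattern, with "provable" standing in for "halts".]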
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence