What if qualia are noncomputable and necessary?

From: Emil Gilliam (emil@emilgilliam.com)
Date: Mon Jul 23 2001 - 21:31:40 MDT

What if:

(1) The mind does something noncomputable (in the Turing sense),
thanks to as-yet unknown physics;

(2) This noncomputability lies at the heart of qualia; and

(3) We must explicitly take advantage of this feat of physics in
order to produce an artifact capable of running "real AI"?

Related topics were touched upon a few months back in the thread "The
Singularity and the quantum mind," but the thread was unfortunately
muddled by the confusion of several unrelated issues, and did not
play itself out satisfactorily.

The consensus I seem to find here is that although (1) and (2) are
possible, they are unlikely, and that even if they are true, then (3)
is so unlikely as to not be worth strategizing about. Is this, in
fact, what most Singularitarians here think? *


At this point I must dispose of some red herrings that inevitably
sidetrack most of these discussions:

(A) Because of Penrose et al., much to-do has been made about quantum
computers in this context, but I must point out that quantum
computers are Turing-computable. That is, there is nothing a quantum
computer can do that a classical computer can't do given enough time.
Okay, that's not really true -- a classical computer can't pick a
"random number." But as I understand it, there is no *function* a QC
can compute that a classical computer cannot. Penrose and others
actually propose something noncomputable, beyond the quantum
computer, in whatever it is that the brain does. Ignore quantum
computers unless it's absolutely necessary to bring them up -- what
is postulated is that some arrangement of atoms can cause nature to
produce a noncomputable function that is explicitly taken advantage
of.
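To make the computability point concrete: a classical computer can simulate any quantum circuit by tracking its state vector, just exponentially slowly in the number of qubits. Here is a minimal sketch (my own illustration, not from the original discussion) simulating a single qubit passed through a Hadamard gate, using nothing but ordinary arithmetic:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a 1-qubit state vector [a, b]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: measurement probabilities from the amplitudes."""
    return [abs(amp) ** 2 for amp in state]

# Start in |0>, apply H: an equal superposition of |0> and |1>.
state = hadamard([1.0, 0.0])
print(probabilities(state))  # approximately [0.5, 0.5]
```

The state vector doubles in size with each added qubit, so this is hopelessly slow in general, but every amplitude is computable to any desired precision -- which is exactly why a quantum computer computes no *function* that a Turing machine cannot.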

(B) Eliezer stated "I do think there's a good possibility that qualia
are noncomputable ... But I definitely deny that the noncomputability
has anything whatsoever to do with either Godelization or reflection.
I *know* how those work." Let's take this statement at face value,
and set aside Godelization and reflection. If qualia are
noncomputable, is there still no possibility that they would be
needed to build an AI capable of bringing us to a Singularity (or
even a weaker "real AI")?

(C) If (1)-(3) turn out to be true, this does not mean that we can't
create an AI. It does mean we'll have to learn to manufacture some
kind of physical artifact other than a computer. I say "artifact"
because the term "machine" has become loaded in the last 50 years to
mean "Turing machine," rather than its more general, original sense
of "something we can manufacture." Of course, if all of physics is
computable, then these two meanings are equivalent.


There is nothing in the consideration of this possibility that
inherently requires reverting to mysticism or teleology (or
accusations thereof).

There is, of course, the fact that empirical evidence pointing to
something noncomputable in the [microtubules, or what have you] is
currently extremely thin, as Ben Goertzel pointed out. That lack of
evidence is the best reason to reject the hypothesis for now.
If the hypothesis is wrong, we should be able to say that it is
erroneous science, rather than pseudoscience.

However, I shall play the Devil's advocate and ask whether anyone has
a backup plan in case (1)-(3) turn out to be true. It would
definitely push the Singularity back by a decade or two, at the very
least. At what point would we decide that it's probable enough -- or,
pessimistically, that the strong-AI program has gone on for too long
without progress -- that it's worth spending time on this? I suppose
it largely (though not entirely) depends on what happens in physics
in the near future.

- Emil

* Not that I am predisposed to insecurity and group-think, but I am
trying to clarify for myself why previous stabs at this topic ended
up the way they did.
