From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Wed Oct 06 2004 - 20:00:20 MDT
Robin Lee Powell wrote:
>
> While this was very interesting reading, it completely fails to tell
> me which of the pieces of information at those links you disagree
> with now, precisely. The following statements are at issue:
>
> 1. Penrose isn't trying to explain quantum physics; he's trying to
> persuade you that the human mind isn't Turing-computable.
> (Penrose is right about this, although purely by coincidence.)
I disagree with 1.
> 2. [snip] the laws of physics to arbitrary precision [are not
> Turing computable]
>
> 3. [snip] the human brain to arbitrary precision [is not Turing
> computable]
2 and 3 aren't yet my business; but as I understand the current consensus
in physics, most physicists think physics is computable to arbitrary
precision, and at least a few think physics is digital. If you pinned me
to a wall and held a gun to my head, I'd be forced to admit a sneaking
philosophical preference for digital physics, but IANAP.
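For a concrete toy picture of what "digital physics" would mean, think of
a cellular automaton: a universe whose entire state is discrete and whose
time evolution is an exactly computable rule. Here is a sketch in Python
using Rule 110, a one-dimensional automaton that Matthew Cook proved
Turing-complete. (This is an illustration of the concept only, not a
claim about how actual physics works.)

    # Rule 110: each cell's next state is a fixed function of its
    # left neighbor, itself, and its right neighbor. The whole
    # "physics" of this toy universe is one computable update rule.

    def rule110_step(cells):
        """Advance the universe one tick (periodic boundary)."""
        n = len(cells)
        return [(110 >> (cells[(i - 1) % n] * 4
                         + cells[i] * 2
                         + cells[(i + 1) % n])) & 1
                for i in range(n)]

    universe = [0] * 40 + [1]   # one live cell at the right edge
    for _ in range(20):
        print("".join(".#"[c] for c in universe))
        universe = rule110_step(universe)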
> From your spiel I gather that you now disagree with statements 1 and
> 3, but I'm not totally certain, and I still know nothing about your
> position on statement 2. No matter what, you should certainly
> update those pages, I think.
The pages need such a major update that I've just sort of given up on it
entirely. Maybe I'll get around to it after the Singularity. Tick tick
tick. There's a hell of a lot of things I "should" be doing.
Mitchell Porter wrote:
>
> Eliezer, I've often asked myself what your new, non-mysterious
> theory of consciousness *is*. You tell us to embrace mundane
> reductionism, but that doesn't answer the question. Everything
> that is, is physical, you seem to say. So: does consciousness
> exist or not? If it does exist, what sort of physical entity
> is it, and how are the various subjective attributes of
> consciousness explained? If it does not exist, then in what
> terms am I to interpret the fact of my own existence, and my
> awareness of it?
...like this, for example.
Yes, I need to write this up. And so, when I saw that this conversation
was starting yet again, I went back to working on... an old paper that
doesn't even explain this issue, and is only peripherally related to it,
but that I'd need to finish before I could start on a document that does
explain it. The paper is called "A Human's Guide To Words"; it's
currently 170K of HTML, and almost finished, I think, but going very
slowly. I've been fiddling with it occasionally for about a year.
I find it increasingly hard to justify any effort that is not on a direct
line to FAI. Consciousness is a fascinating problem, but *explaining* the
problem is not on a direct line to FAI, unless someone offers me a large
incentive to do the explaining. If I can finish and publish "A Human's
Guide to Words", it would give readers some conceptual equipment that
would, I hope, let me explain my answer in a reasonable amount of time.
But briefly: Qualia are not ontologically basic. "Qualia" are the result
of the human brain doing something weird in how it processes reflectivity.
Any puzzle that is apparently about qualia needs to be replaced with a
puzzle about the behavior of intelligent minds when they talk about
qualia. If I ask "Why does Mitchell claim the sky is blue, rather than
green?" I can transparently eliminate Mitchell's cognition from the
problem, and the question reduces directly to "Why is the sky blue,
rather than green?" You can't do this with problems that appear to be
about something called qualia. "Why do I think qualia are unitary?" does
not reduce to "Why are qualia unitary?"
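To make the asymmetry concrete, here is a toy sketch in Python. (The
predicates and the agent names here are invented for illustration; this
is a cartoon of the reduction move, not a theory of consciousness.)

    # Toy model: when an agent's claim faithfully tracks an external
    # fact, the agent's cognition is a transparent channel, and
    # "Why does X claim P?" reduces to "Why P?". When the claim is
    # generated by the brain's own reflective machinery (qualia-talk),
    # the cognition cannot be eliminated from the question.

    def why_agent_claims(agent, claim, external_facts):
        if claim in external_facts:
            # Transparent case: eliminate the agent, ask about the world.
            return "Reduces to: why is it a fact that %s?" % claim
        # Opaque case: the question is about the cognition itself.
        return ("Does not reduce: why does %s's brain generate "
                "the claim '%s'?" % (agent, claim))

    facts = {"the sky is blue"}
    print(why_agent_claims("Mitchell", "the sky is blue", facts))
    print(why_agent_claims("Mitchell", "qualia are unitary", facts))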
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence