Re: The GLUT and functionalism

From: Lee Corbin (lcorbin@rawbw.com)
Date: Wed Mar 26 2008 - 23:47:00 MDT


Stathis writes

>> But I still think that you'll have to aim higher :-) than V1 here.
>
> You can go up as high as you feel is necessary. You suggested that a
> 1000x1000 patch of the Board that was looked up rather than calculated
> might cause a small deficit in consciousness. If that's not enough
> then imagine that half the Board is looked up and the other half
> calculated: surely that should be noticeable?

Noticeable to whom? Here is what I think is going on: *naturally*,
the subject, the person who's supposed to be having experiences,
can report nothing different than what he would report anyway.
All the states that his brain would have reached (under what I call
"totally authentic computation") would still be reached. So in a
certain sense, he *doesn't* notice any difference.

But in another sense he does. First, though, our entire hypothesis
depends on the agencies running the experiment being perfectly aware
of when his brain states are being looked up and when they're being
computed by the usual routes of causality and local reductionism.

Now, let me phrase my answer using an experiment so that there is
no mistaking my meaning. Question: would you prefer

(A) to be tortured for an hour in the old-fashioned way

(B) for records of such an hour merely to be retrieved from
       a galaxy far, far away, a long time ago, in which you were
       tortured just the same, with the states for that hour
       interval brought to Earth and at the proper moment merely
       looked up?

I would far, far prefer option B, even though afterwards I
would remember the experiences equally well (and hideously)
whichever option I chose. Would you really have the fortitude
to say "I'm indifferent concerning A and B"?

In other words, if half my states over a period T are looked up,
and the other half properly calculated, then the misery or
pleasure I experience during T is thereby halved.

>> I think that Chalmers almost by definition can never find what he
>> is looking for, because any explanation would fail to satisfy him
>> for one reason or, if that doesn't work, then for a new one.
>> I'm afraid that an explanation would have to *make* Chalmers
>> feel conscious, or feel an experience.
>>
>> Extremely hypothetical guess: if you took the set of all 26^500
>> explanations of 500 characters in length, not one of them would
>> satisfy those who insist that there is an insoluble mystery to the matter.
>>
>> Lee
>>
>> > http://consc.net/papers/qualia.html
>
> The quoted paper has nothing whatsoever to do with the "hard problem"

Sorry.

> (if that's what you were referring to). It is an argument that, whatever
> consciousness may be, it should be possible to generate it in
> a suitably configured non-biological substrate.

Anyone who's for uploading is clearly already on board with this
point of view.

> The only (naturalistic) way to avoid this conclusion is if the brain
> contains fundamentally non-computable physics, and there is no
> evidence that it does.

Just so. I would be surprised if this were at all controversial between
you and me, or between us and our favorite correspondents.

Lee


