From: Richard Loosemore (rpwl@lightlink.com)
Date: Fri Apr 28 2006 - 13:56:31 MDT
Woody Long wrote:
> Without reference to the analogy, perhaps you would be interested in a
> phone conversation I had with Professor Searle yesterday. Here was the
> heart of it -
>
> WL: A syntactical machine can never be conscious. Correct?
>
> PS: Yes.
>
> WL: And semantical machines that have a semantic understanding of their I/O
> are conscious. Correct?
>
> PS: Yes. And that's the question: how do you get this semantic
> understanding?
>
> WL: Exactly! That's the key point. And my prototype of my invention is
> doing just that.
>
> PS: Then send it to me. (*In a stern tone that implied 'I will believe it
> when I see it.'*)
Strangely enough, I was in an elevator with Searle a couple of weeks ago,
in Tucson.
I told him he should come to my poster at the (Consciousness) conference
we were both attending, because I was going to present a complete solution
to the hard problem of consciousness, and it would be a computational
solution. He grumped that he didn't know what the problem was, and then
tried to detour away from me (we had left the elevator by that point),
nearly colliding with a table in his haste to escape.
And he never turned up at my poster! Poor guy was probably too timid.
:-) :-) :-)
Richard Loosemore
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT