From: Phil Goetz (firstname.lastname@example.org)
Date: Tue Jan 17 2006 - 21:11:22 MST
--- Damien Broderick <email@example.com> wrote:
> At 07:06 PM 1/17/2006 -0800, Phil wrote:
> > > >Woody has not proposed
> > > >any test that can be carried out by a human.
> > >
> > > Has in fact proposed (for a profoundly half-arsed value of
> > > "proposed") a
> > > test that specifically and by design *can't* be carried out by a
> > > human.
> >I didn't mean the test can't be taken by a human
> I did, and said so. Searle designed his Chinese Room thus, as a
> reductio ad absurdum of semantics-free piecemeal emulation.
> Damien Broderick
Searle's Chinese Room is not a reductio ad absurdum of semantics-free
emulation. This is shown by the fact that, when presented with a
scenario in which the Chinese Room is embedded within a robot body just
like a human's, responding directly to sensory stimuli, Searle STILL
says it has no consciousness.
Searle has elaborated repeatedly and extensively on the Chinese Room
argument in the 25 or so years since he made it. One of the things he
says is that we need "brain stuff" to produce consciousness: the
computer lacks consciousness because it lacks a physical substrate
with specific, but currently unknown, properties (much like Penrose's
position). Or, in other words, a soul.
My more important point is that Woody's test is untestable. We have no
way to evaluate whether a machine is conscious of the meaning of its
inputs in the same way that a human is.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT