From: Martin Striz (mstriz@gmail.com)
Date: Fri Oct 28 2005 - 13:35:35 MDT
On 10/28/05, Woody Long <ironanchorpress@earthlink.net> wrote:
>
> > [Original Message]
> > From: Martin Striz <mstriz@gmail.com>
> >
> > If legos could be used to build a machine that encodes information and
> > performs logical operations on it, at a sufficient level of complexity,
> > that lego contraption would be conscious.
> >
>
> I think before we can name it a post-contemporary conscious computer system
> it needs more. Consciousness requires a focal point generally named a self.
> By "self" I mean that focalizing agent exposed in the dual sound source
> experiment. The subject wore earphones. In one ear a conversation was
> played; in the other, a different one. What was revealed is that
> the subject could only attend to and remember one sound source at a time.
> It was impossible to attend to both. So the root of consciousness is a
> focalizing agent, or self, that has no option but to switch its focalizing
> attention to a single source. This self also refers to itself as "I" and
> has personal memory of this I's unique experiences. This self is the
> focalizing agent of consciousness and so must be included in any artificial
> consciousness. Consciousness is also intertwined with and propagated by raw
> sense data (sensor signals), as well as emotional and motivational content
> that create the same interactive effect. So artificial consciousness
> requires all this, and is it not true that to create artificial
> consciousness in a robot is to create artificial life itself?
Yep, I left that part out. If you read the paper that I referenced,
you'll see that the requirement for consciousness is information
*integration* (into the so-called seamless global experience). When I
said "at a sufficient level of complexity" I meant "at a sufficient
level of integration."
The theory holds up to empirical scrutiny, because the brain regions
that we know contribute to consciousness (the thalamo-cortical system)
have precisely the integrative organization necessary (and predicted
by the theory). Other brain regions that don't contribute directly to
consciousness, such as the cerebellum, have more functionally
independent organization.
If you split a conscious system (like the brain) into two independent,
non-interacting parts, you get two new systems, each with a lower level
of consciousness than the original, because there's less integration of
information.
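For concreteness, here's a toy sketch (my own illustration, not anything
from the paper) where "integration" is reduced to the mutual information
between two halves of a two-bit system. In the coupled version one bit
simply copies the other; in the split version the halves don't interact,
and the shared information drops to zero:

import math
import random

def mutual_information(pairs):
    # Estimate I(A;B) in bits from a list of (a, b) samples.
    n = len(pairs)
    p_ab, p_a, p_b = {}, {}, {}
    for a, b in pairs:
        p_ab[(a, b)] = p_ab.get((a, b), 0) + 1.0 / n
        p_a[a] = p_a.get(a, 0) + 1.0 / n
        p_b[b] = p_b.get(b, 0) + 1.0 / n
    return sum(p * math.log2(p / (p_a[a] * p_b[b]))
               for (a, b), p in p_ab.items())

random.seed(0)
# Coupled system: the second bit copies the first (maximal interaction).
coupled = [(a, a) for a in (random.randint(0, 1) for _ in range(10000))]
# Split system: the two bits are generated independently (no interaction).
split = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(10000)]

print("coupled:", round(mutual_information(coupled), 2), "bits")  # ~1.0
print("split:  ", round(mutual_information(split), 2), "bits")    # ~0.0

The measure in the paper is of course more involved than plain mutual
information, but the qualitative point is the same: sever the
interactions and the whole carries no information over and above its
parts.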
My point earlier was that this is substrate independent, and even
legos (which don't create emergent quantum effects, I presume) could
get the job done.
Martin