From: Stephen Tattum (S.Tattum@dundee.ac.uk)
Date: Sat Oct 29 2005 - 06:26:37 MDT
Ken Woody Long wrote:
>An android brain requires an artificial consciousness, defined as a
>post-contemporary computer system that encodes information and performs
>logical operations on it via a focal point generally named a self. By
>"self" I mean that focalizing agent exposed in the dual sound source
>experiment. The subject had earphones on. On one side was played a
>conversation, on the other was played another. What was revealed is that
>the subject could only attend to and remember one sound source at a time.
>It was impossible to attend to both. So the root of consciousness was a
>focalizing agent, or self, that has no option but to switch its focalizing
>attention to a single source. This self also refers to itself as "I" and
>has personal memory of this I's unique experiences. This self is the
>focalizing agent of consciousness and so must be included in any
>artificial consciousness.
Identity and a sense of self are not a necessary prerequisite for an
artificial mind. The idea that they are comes from our inability to see
past our own identities - we feel like focalized agents and can't see
how any intelligence could function without a self too. The experiment
with the sounds says more about the limits of our auditory system than
it does about a self being the root of consciousness. I find selves
quite a ludicrous concept - take 'myself', for example: where is the me
that owns this self? And if there is one, then there must be infinitely
many more, and 1 self is just as ludicrous as 2 or 75. Selfhood and
identity are simply very useful evolutionarily - so useful that hardly
any human can get by without them in everyday life. There are people,
though, who have suffered brain injuries causing memory loss so severe
that they couldn't tell you their own identity. These people are clearly
still intelligent - they can talk, play chess and so on - and in some
ways they are still themselves: they keep the same mannerisms, for
example. My point is that identity is just a story we make up for
ourselves. An AI without a self probably couldn't interact the way a
person does in a world populated by selves, but a self isn't as
necessary as you make out - the patients with memory loss are still
conscious.
As far as uploading is concerned, I think a Moravec transfer - replacing
the brain gradually, piece by piece, while it keeps running - would be
the only way to preserve the self that feels like you. A complete brain
replacement would feel like death the moment your brain was removed,
even though afterwards your body and your new brain would act and feel
just like the old one did, provided the new brain was modeled well
enough. Selves are simply collections of ideas in and around our minds
that all point to an agent behind their actions, but really all that
exists is the ideas and the feeling of being a singular entity. If you
were to cut out small chunks of my brain, say, and speak to me after
each chunk was removed, I wouldn't be that different after any single
chunk, and different functions would be lost with different chunks. My
'self' would disintegrate bit by bit as well; there is no point where my
self resides, because the feeling of it being a singular thing at all is
an emergent property of the whole brain.