Re: Sentience

From: Marc Geddes
Date: Thu Jun 03 2004 - 01:36:21 MDT

 --- Ben Goertzel wrote:
> Eliezer:
> You say that you don't have any idea what I mean by
> "sentience."
> If you check out any
> dictionary you will get the
> basic idea. I'm not being flippant. As inspection
> of any dictionary
> indicates, the concept wraps up two different ideas,
> which I feel belong
> together, but you may feel do not belong together.
> One core meaning of sentience is:
> 1) "Being an intelligent system that contains
> components devoted to
> modeling and understanding itself and its
> relationship to its
> environment, and determining its own overall
> actions."
> Another core meaning of sentience is:
> 2) "Possessing conscious awareness"
> Note that 2 may be made empirical by changing it
> into:
> 2') "Reports experiencing conscious awareness"
> I separate 2 and 2' in order to dissociate the issue
> of the coherence of
> the notion of sentience, from the issue of the
> "reality" of
> consciousness. If you are not comfortable with the
> notion of
> consciousness, you should still be comfortable with
> 2', the notion of
> reported consciousness.
> According to my own philosophy of mind, I think any
> system that is
> "sentient" in sense 1 is almost surely going to be
> sentient in senses 2
> and 2'.
> On the other hand, if you believe that there are
> probably going to be
> systems that are intelligent, self-modeling,
> self-understanding and
> self-determining yet do NOT report having the
> experience of
> consciousness, then naturally you're going to find
> the concept of
> sentience to be ill-defined, because you don't
> consider the two meanings
> 1 and 2/2' to naturally adhere together.
> I think that, although you may disagree with it, you
> should have no
> problem understanding the hypothesis that
> 1 ==> 2
> in the above definitions. Of course, this is a
> conceptual rather than
> fully scientifically/mathematically specified
> statement.
> -- Ben

I do tend to agree with you, Ben. My guess would be
that 1 ==> 2. And if 1 does imply 2, then Eli's
approach would be in serious trouble. Here's why:

Collective Volition draws on the extrapolation of
the volitions of all sentient persons. But if the FAI
establishes that it qualifies as a 'sentient person',
then the FAI has to include its own volition in its
calculations. The calculation of humanity's volition
would be distorted or 'perturbed' by the existence
of the FAI itself. Worse, the perturbation would
quickly blow up into a fatal infinite regress. Recall
that the extrapolation looks at what each sentient
would want if it 'thought faster, thought longer,
were more wise', etc. But the FAI itself would think
millions of times faster than the average human,
think longer, be more intelligent, etc. Therefore a
disproportionate amount of weight would have to be
given to its own cognitive processes when
calculating the Collective Volition (which includes
its own volition). In fact, for a true super-human
intelligence with cognitive powers rivalling all of
humanity, virtually all of the volition that counts
would reside in the FAI itself and hardly any in
'humanity' as a whole. Thus the 'Collective Volition'
actually reduces to the FAI's own volition, with only
a small perturbation from the volition of humanity,
and the term 'Collective Volition' is rendered more
or less meaningless.
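To make the dominance argument concrete, here is a toy model of
my own (not anything from Eli's actual CV proposal): treat each
sentient's extrapolated volition as a number and aggregate by a
weighted average, with weight proportional to cognitive capacity
('thought faster, thought longer'). All the names and figures are
illustrative assumptions.

```python
# Toy model (hypothetical, not the actual Collective Volition
# algorithm): a weighted average of scalar 'volition' values,
# with each sentient weighted by its cognitive capacity.

def collective_volition(volitions, capacities):
    """Capacity-weighted average of scalar volition values."""
    total = sum(capacities)
    return sum(v * c for v, c in zip(volitions, capacities)) / total

# Assumed figures: ~6 billion humans with capacity 1 each, all
# wanting outcome 0.0; one FAI wanting outcome 1.0, with total
# cognitive capacity ~1000x that of humanity (it thinks millions
# of times faster than any individual human).
human_capacity = 6_000_000_000
fai_capacity = 6_000_000_000_000

cv = collective_volition(
    volitions=[0.0, 1.0],
    capacities=[human_capacity, fai_capacity],
)
print(cv)  # ~0.999: virtually all the 'collective' volition is the FAI's
```

On these assumptions the human contribution is a perturbation of
about one part in a thousand, which is the sense in which the
'collective' in Collective Volition would collapse into the FAI's
own volition. (And this is before the regress: the FAI's own
volition term would itself depend on the output of the very
calculation it appears in.)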

"Live Free or Die, Death is not the Worst of Evils."
                                      - Gen. John Stark

"The Universe...or nothing!"

Please visit my web-sites.

Science-Fiction and Fantasy:
Science, A.I., Maths:


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT