Re: Sentience

From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Wed Jun 02 2004 - 08:33:06 MDT


Ben Goertzel wrote:

> Eliezer:
>
> You say that you don't have any idea what I mean by "sentience."
>
> If you check out dictionary.com or any other dictionary you will get the
> basic idea. I'm not being flippant. As inspection of any dictionary
> indicates, the concept wraps up two different ideas, which I feel belong
> together, but you may feel do not belong together.
>
> One core meaning of sentience is:
>
> 1) "Being an intelligent system that contains components devoted to
> modeling and understanding itself and its relationship to its
> environment, and determining its own overall actions."

(1) holds of a Really Powerful Optimization Process (RPOP).

> Another core meaning of sentience is:
>
> 2) "Possessing conscious awareness"
>
> Note that 2 may be made empirical by changing it into:
>
> 2') "Reports experiencing conscious awareness"
>
> I separate 2 and 2' in order to dissociate the issue of the coherence of
> the notion of sentience, from the issue of the "reality" of
> consciousness. If you are not comfortable with the notion of
> consciousness, you should still be comfortable with 2', the notion of
> reported consciousness.

I would not have 2 (or 2') hold of an RPOP if possible, and I must solve
this problem in any case, because 2 must absolutely not hold of any
hypotheses the RPOP employs to model human behavior.

> According to my own philosophy of mind, I think any system that is
> "sentient" in sense 1 is almost surely going to be sentient in senses 2
> and 2'.

A *philosophy* of mind? Of what conceivable use is a *philosophy* of mind?
Do electrical engineers have a philosophy of electricity? They may have
philosophies *about* electricity, its beauty perhaps, but not philosophies
*of* electricity, for they know of what they speak, and therefore they have
no reason to make stuff up at random as if they were Greek philosophers. If
one does not know, one should not invent philosophy, for it is
anti-knowledge and worse than useless; philosophy makes ignorance into an
invincible obstacle, rather than a clean slate to write upon.

> On the other hand, if you believe that there are probably going to be
> systems that are intelligent, self-modeling, self-understanding and
> self-determining yet do NOT report having the experience of
> consciousness, then naturally you're going to find the concept of
> sentience to be ill-defined, because you don't consider the two meanings
> 1 and 2/2' to naturally adhere together.

Yep.

> I think that, although you may disagree with it, you should have no
> problem understanding the hypothesis that
>
> 1 ==> 2
>
> in the above definitions. Of course, this is a conceptual rather than
> fully scientifically/mathematically specified statement.

I understand you, and I just hope that everyone understands that I am
speaking of sentience in the sense of 2, not 1.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

