Re: Sentience [Was FAI: Collective Volition]

From: Randall Randall
Date: Mon Jun 14 2004 - 13:25:39 MDT

On Jun 14, 2004, at 8:00 AM, fudley wrote:
> On Sun, 13 Jun 2004 "Randall Randall" <> said:
> It's logically impossible to test any consciousness theory. Let's say
> you have a super sophisticated brain analyzing machine. One day you
> feel sad and analyze your brain with the machine. You develop a
> reasonable-sounding theory to equate the state of the neurons in your
> head with your subjective experience. How do you test your theory?
> Well, you try it on me. You notice that the state of my neurons is
> similar (but not identical) to yours when you felt sad, and from this
> you use your theory to conclude that I am experiencing sadness just
> like you did. As proof that your theory was successful you point to
> the fact that I have tears in my eyes and made a noise with my mouth
> that sounded like "I feel sad".
> I think this would be good evidence that your theory is probably
> correct, but a skeptic could correctly point out that the state of my
> neurons was not identical to yours, only similar; we are, after all,
> different people with different brains. The differences could be
> crucial: you really felt sad, but it's different with me. My tear
> production is elevated and the nerves in my mouth stimulate my tongue
> to make a noise like "I feel sad", but really I feel nothing.

There are a large number of possible theories that fit
any set of facts and make the same predictions for
experiment. Given an intelligence with the same general
construction (but not identical! I know, I know), it
seems to satisfy Occam better to assume that similar
patterns of neuron firing and behavior are indicative of
similar internal states.

Since you know you "have consciousness", it seems simpler
to assume that others with similar structures and who
claim to "have consciousness" do. This still doesn't mean
that consciousness is anything other than the interaction
of neurons, so you could conceivably ignore it, in the same
way that you could ignore all classical rules and calculate
the behavior of classical systems by using only the quantum
rules (I think?). It's just that classical rules and
consciousness are a shortcut for doing the particle-by-particle
calculation.

> Me:
>>> If I was convinced that was true I would have to become a
>>> creationist; I'd have no alternative, because I would have
>>> absolutely no way to explain how random mutation and natural
>>> selection produced at least one creature that had personality and
>>> consciousness: me.
>> I have no idea why that is your position.
>> How is this different from a statement that you
>> "would have absolutely no way to explain how random
>> mutation and natural selection produced at least one
>> creature that can metabolize alcohol"?
> Because regardless of how important consciousness is to us, evolution
> doesn't give a damn about it; all it's interested in is behavior. The
> ability to metabolize alcohol could change the probability of an animal
> surviving by a huge amount,

It seems plausible that intelligence is only a useful selection
criterion if it occurs with self-interest, and that consciousness
is some combination of easy and likely for random points in the
space of intelligent, self-interested organisms. If that's the
case, then it's not implausible that evolution would favor
organisms that have consciousness.

> but if consciousness and behavior can be
> segregated as you say, then there is no way random mutation and natural
> selection could ever have made us conscious, or rather made me
> conscious; I'm not sure about you.

I'm not actually arguing that consciousness and behavior can be
separated, only that *certain* behaviors exhibited by those who
report consciousness can be separated from consciousness. Unless
you believe that any process is somewhat conscious (and I understand
that some of the people on this list do; I just don't think you're
one of them), you must agree. No? Is your wristwatch conscious?

In particular, it seems that Eliezer believes that the behavior of
general problem solving can be separated from consciousness. I
have no opinion on whether that's true, but I don't think it is
obviously wrong. It may be that I've completely misunderstood
Eliezer, but if that's the case, I'm sure I'll be corrected shortly. :)

Randall Randall <>
Property law should use #'EQ, not #'EQUAL.
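[Archivist's note: the sig contrasts Common Lisp's EQ, which tests object
identity, with EQUAL, which tests structural equivalence. A rough Python
analogue of the same distinction, for readers unfamiliar with Lisp (the
variable names here are illustrative, not from the original post):

```python
# Two lists with the same contents but distinct identities.
a = [1, 2, 3]
b = [1, 2, 3]

# `==` compares structure, roughly like Lisp's EQUAL.
print(a == b)   # True: same contents

# `is` compares identity, roughly like Lisp's EQ.
print(a is b)   # False: different objects
print(a is a)   # True: any object is identical to itself
```

The joke, then, is that property law should care about *which* object you
own, not merely objects that look the same.]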

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT