Re: Animal Consciousness (was Mindless Thought Experiments)

From: M T (nutralivol@yahoo.co.uk)
Date: Fri Mar 07 2008 - 14:30:38 MST


>From: Matt Mahoney <matmahoney@yahoo.com>
>
>It depends on how you want to define consciousness and self awareness. Does
>an animal experience pain if you observe that it learns to avoid actions that
>cause pain? If so, then is autobliss.cpp conscious?
>http://www.mattmahoney.net/autobliss.txt
>
>Some people assert that consciousness requires awareness, specifically,
>episodic memory. For example, a person whose hippocampus was removed would
>forget events immediately after they happened, but could still learn skills.
>If such a person underwent surgery without anesthesia, he would not recall the
>pain. But because he still has procedural memory, he would experience anxiety
>in the place where the torture occurred, without knowing why. If I extended
>autobliss to add episodic memory (it could recall the training sequence, and
>recall instances of recalling it), would it be conscious?

What I hold as the main characteristic of consciousness is subjective experience. Self-awareness is not necessary: a chimp might never reflect on such matters, yet its subjective experience would be remarkably similar to a human's. Memory is not strictly necessary either, since memories degrade massively from the moment they are created (and are flawed even at that point).

Autobliss would not be conscious in either case, because of the low complexity involved and the absence of any mechanism that could give rise to subjective experience. With regard to my original question, autobliss falls into the same category as single-celled organisms: no consciousness that we should take care not to hurt, unless of course we are seriously missing something about the state of reality.

>> It's a practical question for me as it relates directly to being
>> vegetarian but obviously has theoretical implications. Though I will
>> still choose not to eat meat as an exercise in compassion I would like to
>>hear
>> some more educated views on the matter. My (uneducated) view on the matter
>>is
>> that it would be surprising if most insects were conscious because of their
>> simple brains and their great reliance on chemical signals on the blood flow
>> for simple communication of information within the body. The same might hold
>> for some of the smaller mammals. And I would say “no” for the mussels,
>> but wouldn't I love to be surprised...... :)
>
>You are confusing consciousness with ethics. It is in the best interest of
>tribes to practice altruism to other tribe members. Those tribes were more
>successful than the anarchists, so we inherited their genes and culture. This
>doesn't mean such behavior is right, just that we believe it is right. We say
>we do not inflict pain on others because they are conscious like us and we
>would not want them to inflict pain on us. But that is just an excuse to
>justify our beliefs, just like we make excuses to justify killing criminals or
>enemies at war.

I am talking about consciousness in relation to ethics.
I understand how ethics came about; that is not the issue here. Are you content with your "tribal" ethics? Are they part of the natural order of things, so we should keep them? I think not. The ethics you describe are dry and brittle. I understand the repercussions of causing subjective pain to another entity/process/thing/being whose consciousness is similar to mine, and so I do not do it. It does not matter what the other being intends to do. [I am not claiming enlightenment; I am still very much human and egotistical. The point is which ethics one strives for and holds as better.]

And I understand your point from another post that qualia do not exist and that this universe is devoid of the essential "me" and "you".
But here's where I see the difference: in the future, I want to play a game in a virtual world where I go around killing other humans with a big gun, all in good fun, just as I do now in video games. The AGI in charge of creating the virtual world creates many virtual characters for me to kill. What the AGI is actually doing, however, is "roleplaying" these characters: they act as if they are hurt, but there is no one actually there feeling the pain. In the same sense, the characters I roleplay in a pen-and-paper RPG are not really there to feel any pain, and neither is your autobliss, and neither are single-celled organisms. Again, unless we are *really* missing something about this reality...

Unless you reject how I define consciousness, I would like to hear your view, and anyone else's, on where the cut-off point for consciousness lies.

Michael




This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT