Re: A formal measure of subjective experience

From: Thomas McCabe (pphysics141@gmail.com)
Date: Sun Mar 09 2008 - 19:26:31 MDT


A great deal needs to be said here, but I'll just hit the high points.

On Sun, Mar 9, 2008 at 4:15 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:
> I propose the following formal measure of subjective experience. The
> experience of an agent observing event X is K(S2|S1) where S1 is the state of
> the agent before observing X, S2 is the state afterwards, and K is Kolmogorov
> complexity. In other words, the subjective experience is measured by the
> length of the shortest program that inputs a description of S1 and outputs a
> description of S2.

"Subjective experience" is an ill-defined concept (see
http://www.overcomingbias.com/2008/03/wrong-questions.html), and we
could argue about it for thousands of years and never get anywhere.
Isn't this exactly what philosophers have been doing, ever since the
days of ancient Greece?

Your definition doesn't seem to be connected to the usual notion of
"subjective experience". Everyone agrees that a five-line Python
program has no real capability for understanding, in the human sense
of the term, but it's easy to write a five-line program whose state
changes have positive conditional Kolmogorov complexity, and which
therefore counts as having "subjective experience" under your measure.
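
As a rough illustration (my own sketch, not anything from your post;
K is uncomputable, so this uses zlib as a crude upper-bound proxy for
the conditional complexity):

    import zlib

    # Made-up "mental state" before and after observing a trivial event.
    S1 = b"counter=0"
    S2 = b"counter=1"

    # Approximate K(S2|S1) by len(C(S1+S2)) - len(C(S1)) for a real
    # compressor C -- only an upper bound, but it is clearly positive.
    approx = len(zlib.compress(S1 + S2)) - len(zlib.compress(S1))
    print(approx)

On a typical zlib build that prints a small positive number, which is
all your measure asks for.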

> Motivation.
>
> Humans are said to have subjective experience, meaning that:
>
> 1. You can recall that X happened (episodic memory).
> 2. After observing X, you assign a higher probability of observing X again in
> the same context (procedural memory).
> 3. If X has nonzero utility, you change your behavior to increase your
> utility, to seek or avoid X (reinforcement learning).

A paperclip tiler, with no general intelligence whatsoever, satisfies
all three of these criteria.

> All of these result in a change in mental state. On the other hand, if you
> were unconscious or someplace else when X happened, then none of these changes
> would have taken place.
>
> The existence of subjective experience has not been proven.

There's no such thing as proof. See
http://www.overcomingbias.com/2008/01/gray-fallacy.html.

> For example, a
> philosophical zombie is claimed to have no subjective experience but otherwise
> behave identically to a human (e.g. say "ouch" but not feel pain). However a
> rationalist would reject this view because there is no test to distinguish a
> zombie from a human.

This sounds like an eighteenth-century discussion of what air would be
like if you removed the phlogiston.

> Assume humans have subjective experience. This experience must occur in the
> brain. If you put a brain in a vat and stimulated its sensory nerves, it
> would experience a sensation. If you replaced the neurons one at a time with
> equivalent functional devices (Chalmers' fading qualia argument), then there
> would be no change in behavior and presumably no loss of subjective
> experience. If you replaced the brain with any equivalent Turing machine
> (implementing the same function), then it must have subjective experience. If
> we assume further that there are things that don't have subjective experience
> (for example, rocks or dead humans), then subjective experience must be a
> nontrivial property of Turing machines.
>
> Conditional Kolmogorov complexity is therefore one possible measure.

K complexity is hardly a sufficient metric for nontrivial properties
of Turing machines! Consider all Turing machines with N or fewer
states acting on a blank tape. The number of possible Turing machines
grows as C^N, so the number of possible nontrivial properties of
those machines grows as C1^(C2^N) (see
http://www.overcomingbias.com/2008/02/superexp-concep.html). K
complexity, meanwhile, grows only about linearly with N. The amount
of information conveyable by a K value therefore goes as log(N),
while the amount of information needed to pick out an arbitrary
nontrivial property goes as log(C1^(C2^N)) = C2^N * log(C1), which is
on the order of C^N.
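
A toy count makes the gap concrete (my own back-of-the-envelope
sketch; the transition-table convention here is arbitrary):

    # Binary-alphabet machines with up to n states: each of the 2*n
    # (state, symbol) entries picks a write symbol, a move direction,
    # and a next state or halt, giving (2*2*(n+1))**(2*n) tables.
    def machines(n):
        return (2 * 2 * (n + 1)) ** (2 * n)

    for n in range(1, 5):
        m = machines(n)
        # The number of properties (subsets of machines) is 2**m,
        # while K for an n-state machine grows only about linearly
        # in n, so a K value conveys only about log2(n) bits of the
        # roughly m bits needed to specify an arbitrary property.
        print(n, m, "properties = 2**%d" % m)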

> As
> further motivation, consider experiencing two events, X and Y. This will
> result in more to be recalled (greater complexity with respect to episodic
> memory), greater adjustment of probability (procedural memory) and greater
> change in behavior (reinforcement).

Your Kolmogorov-based definition doesn't capture any of these. A
five-line Python program has no episodic memory, no procedural
memory, and no reinforcement mechanism, yet it can still have a
positive K(S2|S1).

> Applications.
>
> Some people believe that it is unethical to harm (kill or decrease utility of)
> agents that have subjective experience. I do not take a position on this
> issue, but if we assume it is true, then:

Beware using one Really Great Idea to explain absolutely everything
(http://www.overcomingbias.com/2007/12/affective-death.html). Human
morality is much more complex than this
(http://www.overcomingbias.com/2007/11/thou-art-godsha.html).

> - Teleportation (making a copy of a person and destroying the original) is
> ethical because K(S2|S1) = 0 (no change in mental state).
>
> - Torture followed by reprogramming the brain to erase the memory of it is
> ethical for the same reason.
>
> - Killing animals to save humans is ethical because humans have bigger brains,
> and therefore a greater capacity for subjective experience. (One could argue
> that killing humans for the benefit of superhuman intelligence is ethical for
> the same reason).
>
> - There is no subjective experience of death because K({}|S1) = 0.
>
> However, this ethical view also leads to the conclusion:
>
> - Creating and then killing a person regardless of what happens in between is
> ethical because K({}|{}) = 0. (But one could argue that killing is unethical
> because of its negative utility on survivors).

It's impossible to have negative K complexity, so this measure can
only register the size of a change in mental state, never whether
the change was good or bad.

> A data compression program like zip has subjective experience in all 3 modes
> that humans do. A compressor accepts a sequence of symbols from an unknown
> source and has the task of predicting future symbols so that it can assign
> shorter codes to the most likely symbols. It has procedural memory because
> after each event (observing symbol X in some context), it raises the
> probability that X will occur next time the same context is observed. It has
> episodic memory because decompression recalls the exact sequence of events.
> It undergoes reinforcement learning with a utility function equal to the
> negative of the length of the compressed output.

I haven't studied compression algorithms extensively, but you seem to
be playing fast and loose with the ideas of "memory" and "utility
function" here. Compression algorithms, so far as I know, have no
explicit utility functions, and at least one widely used algorithm
(LZW) has no explicit representation of probability at all.
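
For what it's worth, the kind of adaptive, context-based counting you
describe looks something like this (a PPM-style toy of my own; the
class and names are invented for illustration, not taken from any
real compressor):

    from collections import defaultdict

    class Order1Model:
        def __init__(self):
            # counts[context][symbol]: how often symbol followed context
            self.counts = defaultdict(lambda: defaultdict(int))

        def prob(self, context, symbol):
            total = sum(self.counts[context].values())
            # Laplace smoothing over a 256-symbol alphabet so unseen
            # symbols keep nonzero probability
            return (self.counts[context][symbol] + 1.0) / (total + 256)

        def update(self, context, symbol):
            # the "procedural memory" step: seeing symbol in this
            # context raises P(symbol | context) next time
            self.counts[context][symbol] += 1

    m = Order1Model()
    before = m.prob("q", "u")
    m.update("q", "u")
    after = m.prob("q", "u")
    print(before, after)  # after > before

Whether bumping a counter like this deserves to be called "memory" or
"experience" is exactly what's in dispute.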

> So even if you accept that
> only one mode of learning is required for subjective experience (say, episodic
> memory), then you have to conclude that if humans have subjective experience
> then so do data compression programs.
>
> The human brain has a capacity for subjective experience K(S2|{}) from birth
> between 10^9 bits (Landauer's estimate of long term memory, see
> http://www.merkle.com/humanMemory.html ), and 10^15 bits (the number of
> synapses).

There's a big distinction between K complexity and bits of memory in
the normal sense. Assuming that the universe is a closed
Turing-computable system, its state at any later time can be
computed from its initial state by a fixed rule plus a time index,
so it cannot have a K complexity significantly higher than the K
complexity at the time of the Big Bang (plus on the order of log(t)
bits for the elapsed time).

> A data compression program together with its output could
> therefore have as much subjective experience as an adult human from birth if
> its input is large enough.
>
>
> -- Matt Mahoney, matmahoney@yahoo.com
>

-- 
 - Tom
http://www.acceleratingfuture.com/tom

