From: Phillip Huggan (email@example.com)
Date: Sat Oct 29 2005 - 11:53:35 MDT
Personal experience reliably supports my earlier assertion. Without LSD, sleep deprivation, or dreams, one cannot seem to experience the mental states of other minds. In all of these cases I'm sure the cause is encroachment upon subconscious and unconscious mind centers, not telepathy. Minute or even greater amounts of information loss don't make a difference, but a physical disconnect from the original brain being copied does.
I don't have a GUT/TOE or a complete Philosophy of Mind. But I do think the key lies in the activity of the synapses, not the neurons. The local activities of a synapse are very fragile. If you replace one with a blank-slate replica, a tiny amount of identity is lost: 1/(one trillion + the number of neurons), by my estimate. This is why a Lego upload won't work, and I don't think any number of Lego sets can build a mind without some radical chemical or electrical superstructure.
Subjective memories can be replicated in an upload. This is not the same thing as preserving identity!! A deep enough hypnosis could probably have you thinking you were Elvis, but the King would still be dead. The fact that my upload could pass a Phillip-Turing test does not console me if I am run over by a train. One million Phil-uploads "living" at present would not increase the standard of living of my mind one bit. The same physics responsible for the Arrow of Time is utilized by human minds. A digital AI can be losslessly ported, but who says it will be sentient? What is its mechanism for emotion, pain, or pleasure? I do not know of a non-analogue way to engineer an endocrine system.
Michael Wilson <firstname.lastname@example.org> wrote:
Phillip Huggan wrote:
> One's personal identity cannot float around to any space-time
> coordinates it pleases.
Why? What fundamental reason is there for this, as long as the
informational/computational constraints are met? A digital sentient
AI can be losslessly moved from hardware to hardware and duplicated;
does a minute amount of quantum information loss for analogue
substrates, or none at all if the transfer is incremental in certain
ways, make any difference?
> The abstract thought experiments equating identical patterns with
> identical identities are neat, but fall apart as soon as physics
> is invoked. In both scenarios below, your upload has a separate
> ontological existence.
You have yet to establish either the ontological primitiveness of
consciousness or how it maps onto physics. For example, you have
yet to establish a link between time-space co-ordinates or substrate
in general and 'youness'; if the cognition is similar enough to
produce indistinguishable objective and subjective results, how are
any other factors going to make any difference to the identity of
the conscious entity?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT