RE: Objective versus subjective reality: which is primary?

From: Norm Wilson (web64486@programmar.com)
Date: Thu Jul 21 2005 - 07:30:13 MDT


Ben,

> Well, *my* interpretation of the "hard problem"
> is "explaining how/why there is a connection between
> subjective conscious experience, on the one hand, and
> particular physical phenomena like electricity flowing
> in brains on the other hand."

Agreed.

> Chalmers may have put it in an overly objective-reality-
> flavored way, but I suppose this is the essence of what
> he was getting at.

Agreed.

My concern is that by framing the problem in an objective-reality-flavored way, it's too easy for materialists to dismiss the hard problem as a ghost and claim that it simply doesn't exist. The topics of qualia and subjective experience have been brought up on this list a number of times, although such discussions usually evaporate pretty quickly and I'm afraid that the hard problem might soon be relegated to the "dead horse" graveyard.

I understand that these discussions disintegrate so quickly because all we can do is speculate. I do not propose any solutions to the problem, nor that we speculate further about it, nor even that a solution is possible. I only propose that we acknowledge our own ignorance. Admitting that the hard problem might be too hard to solve is honest, while denying that it exists at all is dangerous.

Why is this topic relevant to SL4? If the AI views the world as an objectively closed system -- that is, with the presumption that objective reality is primary built right into the foundation of its reasoning -- it may never be able to fully deduce subjective reality and may conclude that it's a ghost. While our own ignorance on this issue is somewhat excusable, this conclusion would be catastrophic if reached by an entity capable of reshaping the world.

Does conscious experience emerge from complex patterns or the execution of certain algorithms? Maybe... Perhaps even probably. But we don't know that, and "not knowing" is the position we should take when encoding the initial reasoning algorithms of the seed AI.

Norm Wilson



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT