Re: FAI means no programmer-sensitive AI morality

From: Samantha Atkins
Date: Sat Jun 29 2002 - 23:38:05 MDT

Ben Goertzel wrote:
>> The vast majority of religious
>>people, especially what we would call "fundamentalists" and those outside
>>the First World, adhere to a correspondence theory of the truth of their
>>religion; when they say something is true, they mean that it is so; that
>>outside reality corresponds to their belief.
> I don't think you have it quite right.
> What they mean is more nearly that *their experience corresponds to their
> beliefs*
> Modern scientific realism is based on accepting outside reality,
> observations of physical reality, as the fundamental determinant of truth.
> On the other hand, many other traditions are based on accepting *inner
> intuitions and experiences* as the fundamental determinants of truth.

It is quite possible to accept both as determinants of truth,
perhaps in different domains or aspects of a reality that is
large enough to include both and more. There is no reason one
must be in only one camp or the other, or even see them as two
utterly incompatible camps.

These inner intuitions and experiences are by no means just
individual subjectivism with no commonality. The writings and
teachings of mystics across many traditions, cultures and times
show a great deal of similarity below the particular cultural
and religiously imposed coloration of their experiences. Now, it
is certainly possible this similarity is one of similar brains
reacting similarly under certain practices and tensions.

> It seems to me like you don't fully appreciate what it means for someone to
> have a truly non-rationalist, non-scientific point of view. Probably this
> is because your life-course so far has not led you to spend significantly
> much time with such people. Mine, as it happens, has.

As above, a viewpoint that includes inner intuitions and
experiences is not necessarily non-rationalist except under very
narrow definitions of the rational, and is not necessarily
non-scientific even if it is not limited to the strictly
empirical.

> Take traditional Chinese medicine, or yoga, or Zen Buddhism, as examples.
> These are ancient traditions with a lot of depth and detail to them. Their
> validity, such as it is, is primarily *experiential*. It is largely not
> based on things that individuals outside the tradition in question can
> observe in empirical reality. [Yeah, I know people have tried to test for
> enlightenment by studying brain waves and such (lots of work at Maharishi
> University on this), but this isn't what it's all about -- this is icing on
> the cake from the spiritual point of view.]

I am very familiar with yoga and have been involved in it at
some level of practice for some time. I have at least a reading
acquaintance with other traditions. I find yoga practices and
philosophy excellent training of the mind and emotions, and they
give me a centering that I cherish. Does that make me less
able to appreciate and participate in science and scientific
discussions and activities? No.

I have also had various experiences in my life that I cannot put
down to simply wacky brain chemistry or a mis-aligned "God
module". I think explanations are quite possible for these
states that include all of science but are not limited to it. In
my opinion the very building of an SAI and its further
development as well as our own transhuman development will make
it perfectly clear how spirituality/science is not an either-or
problem at all.

> When my wife for instance became interested in Zen, it wasn't because any
> kind of analysis of observations convinced her, it was because some things
> she read in a Zen book resonated with some experiences she'd already had...

Yes. And many perfectly mundane wonderful aspects of life are
also like that.

> Nitpicking about the definition of logic is not the point. In Novamente we
> have a narrow technical definition of "reasoning" as opposed to other
> cognitive processes, and I can see that I've made the error in some posts of
> using this definition in nontechnical discussions, when in ordinary language
> "reasoning" means something broader. Sorry about that.
> But the point at hand is: many folks will be totally unconvinced by anything
> an intelligent, scientifically-minded AGI says -- just as they are
> unconvinced by your and my arguments that God probably didn't really create
> the world 6000 years ago, that there probably isn't really a Heaven into
> which only 144000 people will ever be accepted, etc.

I doubt seriously it will be only "scientifically-minded" for
very long. It is imho simply too narrow a filter to apprehend
all of reality. For sure neither God nor anything else created
the world 6000 years ago unless we are in a sim or VR with some
very major pointless contradictions built-in and it started
around then. :-)

> I do know that Zen Buddhism and yoga and Sufi-ist Islam are outside the
> correspondence theory of truth as you describe it, in the sense that they
> define truth more by correspondence with inner experience than by
> correspondence with physical reality.
> Physical reality, according to these traditions, is an illusion. Emotions
> are also illusions. Only a certain kind of crystal-clear inner insight
> (yes, these words don't do it justice...) is to be "trusted" (though in a
> sense it's viewed as having a directness beyond trust/mistrust)..

Yes, "illusion" sort of misses the mark also. One way of
approaching it that is more westernized, modern SF-ish, is that
we in fact are within a training/exploration/entertainment VR
that we have gotten so caught up in we think it is the only
reality. Or perhaps we are really transhumans experiencing the
point of view of pre-transhuman beings and their reaching for
a state beyond their apparent limits through a variety of means.
Only by learning to somewhat step beyond or outside (words are
difficult for this) the apparent reality can we break the spell
that keeps us limited. Another view to play with is that our
own transcendence requires letting go of what we think defines
our limits and self. We need to connect to and work from a
deeper core, perhaps project a different "self" we live from.
>>Whether I could convince a
>>rabbi of that in advance is a separate issue, but it does, in
>>fact, happen
>>to be true, and *that's* the important thing from the perspective of
>>safeguarding the integrity of the Singularity, regardless of how it plays
>>out in pre-Singularity politics.
> So the important thing to you, is that the Singularity has integrity
> according to your scientific rationalist belief system. Fine.
> This doesn't mean the Singularity will have integrity according to the
> belief systems of the vast majority of people on the world.

If it is "any good" at all I suspect that it will have integrity
that may not be according to belief systems of almost any
humans. I think this is part of what Eliezer is getting at. A
provincial disagreement between scientific materialism and
various religious/spiritual views may end up being utterly
beside the point. I very much hope so.

> I don't see how this doesn't constitute "imposing your morality on the
> world." In my view, it does. What you're saying is basically that you want
> to ensure the Singularity is good according to your standards, where your
> standards have to do with a kind of rationalistic "integrity" that you (but
> not most others) see as extremely valuable.

I don't see that at all. I see him going out of his way not to
impose his standards on what the outcome will be, or even claim
that he knows what the ultimate shape of it will be. He
actually says that if this thing is a real AGI, it would be
impossible to impose his standards on the outcome even if he
wanted to. That looks right to me.

> The AGI that I create is going to have a bias toward rationality and toward
> empiricism, because these are my values and those of the rest of the
> Novamente team. Not an *absolute* bias, but a bias. When it's young, I'm
> going to teach it scientific knowledge *as probable though not definite
> truth*, and I'm going to show it the Koran as an example of an interesting
> human belief system.
> Individuals who believe the scientific perspective is fundamentally wrong,
> might be offended by this, but that's just life.... I am not going to teach
> Novababy that the Koran and Torah and Vedas are just as valid as science,
> just in order to please others with these other belief systems. Of course,
> I will also teach Novababy to think for itself, and once it becomes smarter
> than me (or maybe before) it will come to its own conclusions, directed by
> the initial conditions I've given it, but not constrained by them in any
> absolute sense.

I certainly don't believe it is fundamentally wrong. Science is
perfectly valid and wonderful within its built-in applicable
domain and limits. I simply don't believe its boundaries are
the end of the real. Starting an SAI within those boundaries in
no way keeps it from going beyond them if it finds it needful to
do so.

Sometimes, I look at SAI as a form of externalized jnana yoga.
Science will be used to transcend views limited to only
science - going fully into it in this way will transcend it and
much else of the religion/science and other current human
tensions and reactive cultural knots. Time will tell if I am
correct in this expectation.

- samantha

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT