From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Sun Apr 29 2001 - 18:29:26 MDT
Chris Cooper wrote (Mar 2001):
> Now maybe I'm just damaged by growing up in a heavily Southern
> Baptist environment, but these discussions of the Singularity
> begin to sound like New Testament/Revelations-type stuff.
"The great Tao flows everywhere
to the left and to the right
All things depend on it to exist,
and it does not abandon them.
To its accomplishments it lays no claim.
It loves and nourishes all things,
but does not lord it over them."
-- Lao Tse, tr. Alan Watts
"That is another thing so nice about the Tao; it is not bossy! It loves
and nourishes all things but does not lord it over them. Thus the Tao is
something purely helpful - never coercive.
"In the Judeo-Christian notion of God, one thing which is so rigidly
stressed is obedience to God! The great sins are 'disobedience, rebellion
against God, pride, self-will,' etc..."
-- Raymond Smullyan, "The Tao is Silent"
It so happens that I believe that the Sysop Scenario is influenced neither
by Judeo-Christian apocalypticism nor by Taoist philosophy. A Sysop
is neither God nor the Tao; a Sysop is a Sysop. Are there concrete
differences? Sure; the Tao may not coerce, but (in Eastern philosophy)
the Tao still does influence all living things, pervasively, in terms of
higher-level patterns and not just the underlying matter, and the Tao does
this without first asking permission. Thus a Sysop is not the Tao.
Nonetheless, to accuse Friendly AI of being contaminated by
Judeo-Christian ideas, or to analogize between the Sysop and God, is to
display Western parochialism. Any culturally sophisticated antagonist
should accuse Friendly AI of having been contaminated by Taoism and Zen.
I deny both accusations, of course, and would as soon draw no analogy at
all between FAI and *any* religion. You can judge this by the fact that I
didn't think of this defense for quite some time after hearing the
original accusations of theological contamination. Still, I thought I'd
bring it up, since at last check a few people were still unconvinced.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT