Re: friendly ai

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jan 28 2001 - 11:59:23 MST


Ben Goertzel wrote:
>
> > > > And why aren't the "goal of creating
> > > > knowledge" and the "goal of learning new things" subgoals of
> > > > Friendliness?
> > >
> > > They just aren't subgoals of "friendliness to humans" ... or of
> > > "Friendliness" under any definition of that term that seems
> > > natural to me ...
> >
> > It seems *very* natural to me. *I'm* Friendly, or at least I try, and to
> > me there's a very obvious connection between creating knowledge / learning
> > new things and getting to the Singularity. I have the innate human joy in
> > it too, of course, but it's an innate joy that I cherish and cultivate
>
> Hmmm...
>
> Buddhism is a belief system in which compassion (friendliness) is key, but
> learning & generation of knowledge are not considered important

*Cough*Smullyan*cough*.

Yes, Buddhism is a belief system which specifically goes out of its way
to say that all apparent human accomplishments are transitory - though I
do believe that individuals are still supposed to walk the path of
enlightenment - which some believers could take as a commandment to
suppress the innate human joy in learning and the generation of knowledge.
There are meme complexes that label-as-shameful survival, reproduction,
wealth accumulation, laughter, eating... just about anything... what of
it?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
