From: Eliezer Yudkowsky (firstname.lastname@example.org)
Date: Wed Jun 02 2004 - 13:41:40 MDT
Ben Goertzel wrote:
>> It may even be that I'm not very nice. Altruistic towards humans in
>> general, yes, but with a strong tendency to think that any given human
>> would be of greater worth to the human species if they were hung off a
>> balloon as ballast. So frickin' what?
> So ... this tendency of yours makes me feel like it would be a fairly
> bad idea to trust you with my own future, or the future of the human
> race in general.
Yeah, okay, see, this is the problem. If that's not the guru thing in
operation, I don't know what is. No, you don't think I'm a guru. But you
would hold me to a guru's standards. I am who I am, and I'm *not* a guru.
The world must be saved, and it is a technical problem that requires mad
scientists. Not a mystical problem that requires a guru. Just a
particularly severe mundane problem that ordinary mundane reality happened
to throw at us. Altruism toward humans in general is all that is required.
I don't have to be perfect sweetness and light. (Perfectly following the
Way of Bayes is still a job requirement.)
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT