From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu May 29 2003 - 19:34:34 MDT
Eliezer S. Yudkowsky wrote:
> Ben Goertzel wrote:
>
>>
>> Volition is a component of "deep, real human happiness" but I don't think
>> it's the only component.
>>
>> One could construct extreme cases of human minds that were strongly
>> self-determined yet were morally and aesthetically repugnant to all of
>> us...
>
> That's not the question, though; the question is whether we, or a
> Friendly AI, should interfere with such a mind.
Er, to amplify: I was not saying that volition is the only element of
human happiness. I was saying that volition should be substituted for
"human happiness" in the role that happiness plays in utilitarian schemas.
Maybe some people don't want to be happy, or maybe they have things they
value more highly than happiness. I do.
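
To put the substitution schematically (a minimal sketch, assuming a
simple additive utilitarian schema; H_i and V_i are illustrative
symbols, not anyone's established notation):

    classical schema:    maximize  \sum_i H_i   (H_i = person i's happiness)
    with volition:       maximize  \sum_i V_i   (V_i = degree to which
                                                 person i's volition is
                                                 satisfied)

The point of the substitution is that V_i can diverge from H_i: a person
may will something that does not make them happier, and under the
volition-based schema that choice still counts in their favor.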
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence