Re: Maximize the renormalized human utility function!

From: Michael Vassar (michaelvassar@hotmail.com)
Date: Fri Aug 11 2006 - 01:20:49 MDT


Err...Martin. Michael was talking about using nanotech to duplicate people,
not time travel or cloning.

>From: "Martin Striz" <mstriz@gmail.com>
>Reply-To: sl4@sl4.org
>To: sl4@sl4.org
>Subject: Re: Maximize the renormalized human utility function!
>Date: Fri, 11 Aug 2006 02:53:29 -0400
>
>On 8/11/06, Michael Anissimov <michaelanissimov@gmail.com> wrote:
>>Tennessee,
>>
>> > Well, only if you completely ignore the effect of the environment of
>>the
>> > individual and all the other consequent effects of that idea.
>>
>>What? If I copy the most benevolent person I know, then they will be
>>benevolent, no matter the environment. Humans are flexible - they
>>adjust. Someone won't automatically turn evil just because they're
>>placed in a slightly different environment.
>
>I bet they would, if, for example, you abused or tortured them,
>especially in childhood. They may not be "evil," but they would be so
>psychologically damaged that they wouldn't be the most benevolent
>person you know anymore; they might not even be benevolent at all.
>
>Martin


