From: Murphy, Tommy (TommyMurphy@livingstonintl.com)
Date: Tue Jan 22 2008 - 15:47:29 MST
Given that morality arises from, and changes with, intelligence, I fail
to see how you can predict your own post-singularity morality.
I certainly couldn't have pictured myself writing that last sentence if
you'd asked me when I was 3.
-----Original Message-----
From: owner-sl4@sl4.org [mailto:owner-sl4@sl4.org] On Behalf Of Robin
Lee Powell
Sent: January 22, 2008 5:16 PM
To: sl4@sl4.org
Subject: Re: When something impossible happens
On Tue, Jan 22, 2008 at 04:59:08PM -0500, Randall Randall wrote:
> On Jan 22, 2008, at 3:40 PM, Robin Lee Powell wrote:
>> On Tue, Jan 22, 2008 at 03:23:18PM -0500, Murphy, Tommy wrote:
>>> To my understanding the odds of this being a simulation are
>>> overwhelmingly high, so I'd be fascinated to see you expand on this.
>
>> Only if you accept certain assumptions that I, for one, do not.
>> In particular, you have to accept that future civilizations will run
>> ancestor simulations. I find it ridiculous that a civilization that
>> advanced would allow simulation of the kind of suffering that goes on
>> here on earth every day. In fact, I'd expect that any reasonable
>> culture would have something equivalent to the death penalty for such
>> actions.
>
> The idea that in a future civilization something done entirely with
> private computing could be disallowed implies a crippling lack of
> freedom for individuals in said civilization.
Yep. Not seeing a problem here. Certainly if I lived in a civilization
without such rules and I found you were doing such things with your
private computing, I'd hunt you down and kill you, regardless of the
consequences to myself. It's a clear-cut issue of basic morality to me.
-Robin
--
Lojban Reason #17: http://en.wikipedia.org/wiki/Buffalo_buffalo
Proud Supporter of the Singularity Institute - http://intelligence.org/
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/