Re: [sl4] Our arbitrary preferences (was: A model of RSI)

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Sep 26 2008 - 08:28:46 MDT


--- On Fri, 9/26/08, Stuart Armstrong <dragondreaming@googlemail.com> wrote:
> > Could someone remind me again, what are we trying to
> > achieve with a singularity?
>
> The survival of some version of humanity. Beyond that, the usual
> eternal meaningful happiness and immortality stuff. Beyond that, we all
> disagree.

We don't agree about the first part either. A singularity could take many forms that result in human extinction, the extinction of all DNA-based life, or the extinction of all life (the last being the only stable attractor in an evolutionary process). At best it will produce a godlike intelligence that, by definition, bears little resemblance to humanity and is unobservable to any humans who remain.

Furthermore, our quests for happiness and immortality serve to increase our evolutionary fitness, but only so long as they remain unattained: it is the pursuit that motivates survival and reproduction, and actually reaching either goal would remove that motivation.

-- Matt Mahoney, matmahoney@yahoo.com
