From: Nick Tarleton (email@example.com)
Date: Fri Sep 26 2008 - 09:16:21 MDT
On Fri, Sep 26, 2008 at 10:28 AM, Matt Mahoney <firstname.lastname@example.org> wrote:
> --- On Fri, 9/26/08, Stuart Armstrong <email@example.com> wrote:
> > > Could someone remind me again, what are we trying to
> > > achieve with a singularity?
> > The survival of some version of humanity. Beyond that, the usual
> > eternal meaningful happiness and immortality stuff. Beyond that, we all
> > disagree.
> We don't agree about the first part either. A singularity could take many
> forms that result in human extinction, extinction of all DNA based life, or
> extinction of all life (the latter being the only stable attractor in an
> evolutionary process).
Well, the entire point is to see that this doesn't happen.
> At best it will result in godlike intelligence that (by definition) bears
> little resemblance to humanity, and which will be unobservable to any humans
> who are still present.
You've completely lost me; why couldn't we observe a superintelligence?
> Furthermore, our quests for happiness and immortality serve to increase our
> evolutionary fitness, but only if they cannot be obtained.
This archive was generated by hypermail 2.1.5 : Wed Jun 19 2013 - 04:01:43 MDT