Re: Why playing it safe is the most dangerous thing

From: Peter de Blanc (peter.deblanc@verizon.net)
Date: Fri Feb 24 2006 - 10:15:39 MST


On Fri, 2006-02-24 at 06:04 -0500, Ben Goertzel wrote:
> * If we launch a Singularity before the jerks in power figure out
> what's up, we have a 50/50 or so chance of a good outcome (by the
> Principle of Indifference, since what happens after the Singularity is
> totally opaque to us lesser beings)

But a 50/50 chance of a good outcome is not the null hypothesis. If the Singularity is
really totally opaque to us, then we should imagine that all universe
states are equally probable post-Singularity. The vast majority of
possible configurations of matter do not contain human life, so the null
hypothesis is that humans almost certainly cease to exist.

The Singularity is not totally opaque to us because we expect it to be
the result of goal-oriented cognition, so we can expect the post-
Singularity world to maximize some utility function. If you're blindly
rushing into the Singularity and not trying for FAI, then you have very
little knowledge about that utility function - it may as well be random.

I have a hard time believing that a randomly-selected utility function,
when maximized, could result in human life, because humanity is very
complex.
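Here is a minimal sketch of that intuition, with stand-ins of my own choosing:
identify "worlds" with short bit strings, treat a random utility function as
one with i.i.d. random values over worlds (so its maximizer is a uniformly
random world), and count how often the winning world happens to contain a
fixed "complex" pattern. The pattern size K_BITS is arbitrary and hypothetical.

import random

N_BITS = 20                    # hypothetical size of a "world" description
K_BITS = 12                    # bits the "humans exist" pattern constrains
MASK = (1 << K_BITS) - 1
TARGET = random.getrandbits(N_BITS)

def contains_humans(world: int) -> bool:
    """True if the low K_BITS of `world` match the target pattern."""
    return (world & MASK) == (TARGET & MASK)

trials = 100_000
hits = 0
for _ in range(trials):
    # A utility function with i.i.d. random values over worlds has a
    # uniformly random argmax, so we can sample the maximizer directly
    # instead of enumerating all 2**N_BITS worlds.
    maximized_world = random.getrandbits(N_BITS)
    hits += contains_humans(maximized_world)

print(f"fraction of random maximizers containing 'humans': {hits / trials:.5f}")
print(f"expected: 2^-{K_BITS} = {2**-K_BITS:.5f}")

The hit rate falls off as 2^-k in the number k of bits the pattern constrains,
and anything as complex as humanity constrains vastly more than 12 bits.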


