Re: [sl4] What is the probability of a positive singularity?

From: Nick Tarleton (nickptar@gmail.com)
Date: Wed Jul 23 2008 - 17:42:09 MDT


On Wed, Jul 23, 2008 at 5:04 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:

> Another possible scenario is that once we have the technology to reprogram
> our brains (either in-place or uploaded), that a fraction of humans won't go
> along. The brain is programmed to find the state x that maximizes utility
> U(x). In this state, any perception or thought will be unpleasant because it
> would result in a different mental state.
>

To say the brain is "programmed" to do anything really stretches the
metaphor. More importantly, the fact that this scenario is intuitively
undesirable suggests that the human utility function, to the extent such a
thing exists, is defined over histories rather than timeslices. (At least the
'utility function' of the subself writing this is - other subselves might
have preferences over timeslices.)
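
To make the distinction concrete, here is a toy sketch (my own formulation,
nothing from Matt's post; the state values and the "novelty" bonus are made
up): a timeslice utility scores each mental state on its own, so locking into
the single best state is optimal, while a history utility scores the whole
trajectory and can penalize exactly that kind of stasis.

    # Toy contrast between timeslice utility and history utility.
    # Hypothetical model: mental states are floats, a "frozen" history
    # repeats the single best state forever.

    def timeslice_utility(state):
        # Utility of one mental state in isolation.
        return state

    def history_utility(history):
        # Utility of a whole trajectory: sum the per-state utilities,
        # then add a crude bonus for variety so a static history loses.
        base = sum(timeslice_utility(s) for s in history)
        novelty = len(set(history))
        return base + novelty

    frozen = [1.0] * 5                  # locked into the best state
    varied = [1.0, 0.8, 0.9, 1.0, 0.7]  # less pleasant, more varied

    # Summed timeslice utility prefers the frozen history...
    assert sum(map(timeslice_utility, frozen)) > sum(map(timeslice_utility, varied))
    # ...but the history utility prefers the varied one
    # (prints roughly 6.0 and 8.4).
    print(history_utility(frozen), history_utility(varied))

Nothing hangs on the exact numbers; the point is only that a function over
histories can encode "don't freeze into one state", which a per-timeslice
score cannot.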

> The fraction that realizes utopia = death, and that evolution is smarter
> than you are, will be the ones that pass on their genes. There is a good
> reason that humans fear death and then die, but not all of us realize it
> (including SIAI, it seems).
>

?


