Re: An essay I just wrote on the Singularity.

From: Robin Lee Powell (rlpowell@digitalkingdom.org)
Date: Wed Dec 31 2003 - 17:39:29 MST


On Wed, Dec 31, 2003 at 04:14:58PM -0800, Tommy McCabe wrote:
> --- Randall Randall <randall@randallsquared.com> wrote:
> > This is not at all true. I think it's quite arguable that
> > strong superintelligence is impossible. For instance, it might
> > be that physical law doesn't permit complexity above some
> > ceiling. If so, it might be that the smartest possible humans
> > are already very close to that limit.
>
> The difference between 'the smartest possible humans' and 'the
> dumbest possible humans' is incredibly tiny in the space of all
> minds in general. There is quite possibly some hard upper limit to
> intelligence, but for it to be exactly in the incredibly narrow
> range of minds represented by Homo sapiens sapiens would be almost
> an absurdity.

I think Randall's point is that if there is an upper bound, the
reason humans are only as smart as we are might be that we've
already hit it.

-Robin

-- 
Me: http://www.digitalkingdom.org/~rlpowell/  ***   I'm a *male* Robin.
"Constant neocortex override is the only thing that stops us all
from running out and eating all the cookies."  -- Eliezer Yudkowsky
http://www.lojban.org/             ***              .i cimo'o prali .ui
