From: Tommy McCabe (rocketjet314@yahoo.com)
Date: Wed Dec 31 2003 - 17:14:58 MST
--- Randall Randall <randall@randallsquared.com>
wrote:
>
> On Wednesday, December 31, 2003, at 12:40 PM,
> Michael Anissimov wrote:
> >
> > "Combining a few issues here. I believe that
> strong superintelligence
> > is possible. Furthermore, I believe that to argue
> to the contrary is
> > amazingly rank anthropocentrism, and should be
> laughed at. Beyond
> > that, I think full AI is possible. It's the
> combination of the two
> > that's interesting."
> >
> > I agree that people who believe strong
> superintelligence is impossible
> > are memetically distant enough from
> Singularity-aware thought that
> > trying to avoid offending/confusing them is
> pointless.
>
> This is not at all true. I think it's quite arguable
> that strong superintelligence is impossible. For
> instance, it might be that physical law doesn't
> permit complexity above some ceiling. If so, it might
> be that the smartest possible humans are already very
> close to that limit.
The difference between 'the smartest possible humans'
and 'the dumbest possible humans' is incredibly tiny
in the space of all possible minds. There may well be
some hard upper limit to intelligence, but for that
limit to fall exactly within the incredibly narrow
range of minds represented by Homo sapiens sapiens
would be an almost absurd coincidence.
> This might not seem very likely to you, but unless
> you can logically rule it out, it isn't incompatible
> with a technologically induced singularity (in the
> weak sense).
>
> --
> Randall Randall
>