Re: An essay I just wrote on the Singularity.

From: Tommy McCabe (rocketjet314@yahoo.com)
Date: Wed Dec 31 2003 - 18:43:30 MST


--- Randall Randall <randall@randallsquared.com> wrote:
>
> On Wednesday, December 31, 2003, at 07:39 PM, Robin Lee Powell wrote:
>
> > On Wed, Dec 31, 2003 at 04:14:58PM -0800, Tommy McCabe wrote:
> >> --- Randall Randall <randall@randallsquared.com> wrote:
> >>> This is not at all true. I think it's quite arguable that strong
> >>> superintelligence is impossible. For instance, it might be that
> >>> physical law doesn't permit complexity above some ceiling. If so,
> >>> it might be that the smartest possible humans are already very
> >>> close to that limit.
> >>
> >> The difference between 'the smartest possible humans' and 'the
> >> dumbest possible humans' is incredibly tiny in the space of all
> >> minds in general. There is quite possibly some hard upper limit
> >> to intelligence, but for it to be exactly in the incredibly
> >> narrow range of minds represented by Homo sapiens sapiens would
> >> be almost an absurdity.
> >
> > I think Randall's point is that if there is an upper bound, humans
> > might be as smart as we are because we've already hit it.
>
> Yes, this was exactly my point.
>
> Mind you, I don't have a lot of reason to think that it's the case,
> but I do subscribe to the lesser position that humans are nearly at
> the limit of intelligence for the kind of architecture we run on. I
> have little evidence for it, but none for the alternative.

Why should we be nearly at the limit? It would be quite a coincidence
if the very first species to achieve general intelligence were also
nearly at the limit of the architecture it runs on.
