RE: Individualism in Transhuman Society.

From: Ben Goertzel (ben@webmind.com)
Date: Sat Nov 18 2000 - 15:00:27 MST


> First, I should give a little background about the kind of singularity I
> am expecting. My intuition is that whatever "magic" advances in physics
> provide, the laws of thermodynamics and the speed of light will remain,
> and that these will be sufficient to keep a true singularity from
> occurring.[1] (Regardless, the maximum rate of change on the S curve
> that occurs will be pretty extreme, so it could be considered a
> singularity in the colloquial sense.)

It seems to me that this paragraph represents a misunderstanding of the
notion of a Singularity as Eliezer means it.

I don't see any evidence that the known laws of physics prevent
self-modifying AI systems from achieving amazingly superhuman,
exponentially accelerating degrees of intelligence...

I guess this could be true if Penrose and company are right that mysterious
freaky quantum gravity effects (allegedly present in the brain) are a
necessary component of intelligence... But I doubt it; there is as yet no
evidence. My own intuition is that the claim is fundamentally illogical and
ill-founded. I've argued this in detail in my chapter on "self-generating
systems" in Chaotic Logic, and I certainly failed to convince Penrose,
Gyorgy Kampis, and others of that ilk...

ben
