Re: The inevitable limitations of all finite minds....

From: J. R. Molloy (jr@shasta.com)
Date: Fri Nov 24 2000 - 22:06:18 MST


Ben Goertzel wrote,
> "Requires the attention of all sane human beings..." is pretty strong...

You're right about that. I tend to get carried away sometimes. No doubt
plenty of perfectly sane people will not be required to think about the
technological Singularity, and probably shouldn't.
I was trying to express my feeling that something as all-encompassing as a
real, great big honking Singularity will require all the talent,
intelligence, wisdom, and sanity that humanity can muster.

> I think that most people are not sufficiently open-minded to be
> convinced to take a notion like the Singularity seriously

Now there you go again, calling it a "notion." I admit that many other
possible/probable scenarios may render the Singularity just a notion
(someday), but to the degree that we can see the Singularity as the most
satisfactory response to urgently needed change, should we continue to
view it as a notion? I mean, before the Manhattan Project, atomic energy
was just a notion.

> Opening up human minds is a huge task in itself, and the place to start
> if one wants to take on this task is probably not with the Singularity...

Right. The Singularity will be the *last* thing to open up human minds,
just before it disassembles them to use their atoms for increased SI
memory demands.

> Once Webminds, Eliezer-minds, or other reasonably intelligent AI's are
> out there, the public consciousness will get used to AI, and then the
> Singularity will become more "thinkable" among the majority of people...

That sounds like a very calm, reasonable, considered, and careful
approach. If, however, the Singularity is something other than a notion,
then it seems to me that what the majority of people need is a strong
shock to wake them up to the accelerating advance of a life form as
different from us as we are from bacteria. As I see it, the Singularity
(the quickening, hyperplexity, the fourth stage of evolution, etc.)
reveals itself beforehand to people who are already wide awake and a
little bit edgy. In addition, since the Singularity will incorporate (in
more ways than one) superintelligence, it will be able to solve problems
much better and faster than humans or teams of humans can. In other
words, it will be able to solve any public relations problem faster than
the public can invent one.

So what's at issue here is not how to present the Singularity to a
reluctant public, but rather how to bring about the Singularity before the
truly unthinkable happens, namely humanity blowing itself up in yet
another global holocaust born of tribal conflicts.

The limitations of a superintelligence will probably be clearer to the SI
than to us, no matter how much we may console ourselves that the SI does
in fact have limitations. Just for kicks, ask yourself: if the
Singularity and its attendant SI is so smart, won't it be able to figure
out how to make itself "thinkable" among the majority of people...

Stay hungry,

--J. R.
3M TA3

"It's not your vote that counts,
it's who counts your vote."
--Al Gore
(Or was it Joseph Stalin... Hitler? Oh well, one of those socialists.)
