RE: About that E-mail:...

From: Michael LaTorra
Date: Sat Sep 30 2000 - 19:47:47 MDT

I think Ben is correct that no one will take serious action against
developers of AI unless or until they seem to be achieving success.

All the more reason to be circumspect about what you may have achieved,
until it is too late for anyone else to stop you.

No open source code on this project!

Michael LaTorra

-----Original Message-----
From: [] On Behalf Of Ben Goertzel
Sent: Saturday, September 30, 2000 7:36 PM
Subject: RE: About that E-mail:...

I really doubt this is true... I don't think that anyone will be hunted down
or otherwise harassed until demonstrable superhuman intelligence has been
ACHIEVED.

For instance, I've had a start-up company devoted to creating superhuman
intelligence for 3 years now, and no one has harassed me, because NO ONE
BELIEVES WE CAN.

Once the thinking machine is demonstrated -- ~then~ we'll have to start to
worry.


> -----Original Message-----
> From: [] On Behalf Of Josh Yotty
> Sent: Saturday, September 30, 2000 8:05 PM
> To: Sl4
> Subject: About that E-mail:...
> I'm willing to bet the people working toward superhuman
> intelligence will be hunted down. Of course, the people hunting
> us down will be irrational, ignorant, narrow-minded and stupid. If
> I remember correctly, less than ten percent of the world's
> population can be classified as rational. (This is temperament.
> Check out what I mean at
> Images of the Salem witch trials come to mind. We probably will
> not be safe. People will automatically think that we are trying to:
> A) Take over the world.
> B) "Purify" the world by killing most of humanity
> C) Any other stupid reason you can think of.
> Humanity, as a whole, is stupid. America is made up of media
> zombies who act on gossip, whims, and rumors. Having an
> original and true thought really hurts. In fact, Bill Joy's "Why
> the future doesn't need us" in Wired quoted a passage from
> Kaczynski (you know, the Unabomber) that stated, basically, that
> superhuman intelligence would either destroy us all or take away
> all meaning from life.
> Well, anyway, we might have to move to another country,
> gradually introduce the concept to other people, or not inform
> the general public at all (either by saying nothing or by hiding
> it behind lots of technospeak, computer jargon, and large-word
> gobbledygook).
> What do you think?
> Josh Yotty
> | Orion Digital |
