From: Ben Goertzel (email@example.com)
Date: Sat Sep 30 2000 - 20:09:51 MDT
I'll nudge Adam to issue another investor newsletter.
We have a few real customers now for the text classification product, and
we're on the verge of closing a couple of other big deals -- send me your
private e-mail address and I'll briefly fill you in
(I can't deduce it from the header of this SL4 list e-mail).
As for the Webmind "real AI" project, which is more relevant to this mailing
list than our short-term business success, progress is steady and exciting.
We're not there yet, but the roadmap is incredibly clear at this point; we're
now chugging through a long and irritating process of testing what happens
when you try to run all the modules of Webmind together on a big distributed
computer network. Lots of parameter optimization and debugging... more time,
blood, sweat, and tears... but that's all that presently stands in the way of
us starting the serious effort of teaching Baby Webmind to think and
communicate...
In terms of the sysopmind vision of AI, in my view that's Webmind 2.0. First
we get Webmind to a reasonable level of intelligence -- ~then~ it will have
the reasoning ability to understand, improve, and eventually totally overhaul
and replace its own source code.
> -----Original Message-----
> From: firstname.lastname@example.org [mailto:email@example.com] On Behalf
> Of Brian Atkins
> Sent: Saturday, September 30, 2000 9:57 PM
> To: firstname.lastname@example.org
> Subject: Re: About that E-mail:...
> Ben, I'd just be happy if you guys started making some money and IPOed :-)
> Tell your investor newsletter guys that it's been a long dry
> summer newswise.
> (Brian the investor in Intelligenesis...)
> Ben Goertzel wrote:
> > I really doubt this is true... I don't think that anyone will be hunted
> > down or otherwise harassed until demonstrable superhuman intelligence has
> > been ACHIEVED.
> > For instance, I've had a start-up company devoted to creating superhuman
> > intelligence for 3 years now, and no one has harassed me, because NO ONE
> > BELIEVES WE CAN REALLY DO IT...
> > Once the thinking machine is demonstrated -- ~then~ we'll have to start
> > to worry...
> > ben
> > > -----Original Message-----
> > > From: email@example.com
> [mailto:firstname.lastname@example.org] On Behalf
> > > Of Josh Yotty
> > > Sent: Saturday, September 30, 2000 8:05 PM
> > > To: Sl4
> > > Subject: About that E-mail:...
> > >
> > >
> > > I'm willing to bet the people working toward superhuman
> > > intelligence will be hunted down. Of course, the people hunting
> > > us down will be irrational, ignorant, narrow-minded, and stupid. If
> > > I remember correctly, fewer than ten percent of the world's
> > > population can be classified as rational. (This is temperament.
> > > Check out what I mean at http://www.keirsey.com)
> > >
> > > Images of the Salem witch trials come to mind. We probably will
> > > not be safe. People will automatically think that we are trying to:
> > >
> > > A) Take over the world.
> > > B) "Purify" the world by killing most of humanity
> > > C) Any other stupid reason you can think of.
> > >
> > > Humanity, as a whole, is stupid. America is made up of media
> > > zombies who act on gossip, whims, and rumors. Having an
> > > original and true thought really hurts. In fact, Bill Joy's "Why
> > > the Future Doesn't Need Us" in Wired included a passage from
> > > Kaczynski (you know, the Unabomber) that stated, basically, that
> > > superhuman intelligence would either destroy us all or take away
> > > all meaning from life.
> > >
> > > Well, anyway, we might have to move to another country, gradually
> > > introduce the concept to other people, or not inform the general
> > > public (either by not saying anything or by hiding it behind lots
> > > of technospeak, computer jargon, and large-word gobbledygook).
> > >
> > > What do you think?
> > > Josh Yotty
> > > | Orion Digital |
> > > email@example.com
> > > http://www.crosswinds.net/~oriondigital/
> > >
> Brian Atkins
> Director, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT