From: Tommy McCabe (email@example.com)
Date: Fri Mar 05 2004 - 04:37:23 MST
--- Marc Geddes <firstname.lastname@example.org> wrote:
> I'm glad to see that Sing Inst finally has an
> executive director. Good luck!
> Some thoughts on the future: If you're committed to
> the task of creating FAI, you need to realize that
> you're in it for the long haul. Utopian visions of
> Singularity in only 5-20 years are almost certainly
> fantasy (although I'd love to be proved wrong).
> People at the forefront of research are almost
> always far too over-optimistic about how long it
> would take to achieve major breakthroughs.
> Fundamental knowledge about general intelligence is
> still missing. It is estimated that only about 2% of
> everything there is to know about cognitive science
> is known. Even with exponential progress, 100%
> knowledge wouldn't be reached for another 40-50 years.
AI is a very difficult problem, yes, but how soon the
project is completed is a function of probably dozens
of variables. Don't jump to conclusions.
> The first design and coding effort of a major AGI is
> not likely to succeed, although it will be instructive
> and lessons will be learned. The second generation
> effort won't succeed either, although again it will
> advance the field. The third generation effort...
> might be closing on it. And the fourth generation
> effort... that's the one that will get there ;)
When SIAI's AI team gets together, if you just take
the first ideas that pop into everyone's heads, yes,
it probably won't succeed. But who says an AI can't be
rewritten by humans, if not by itself yet?
> I would say that a 40-60 year time frame for success
> is realistic. That's what I would call 'Singularity
> realistic. That's what I would call 'Singularity
> Realism'. You have to be in it for the long haul.
The objective is to get to the Singularity ASAP before
the planet is destroyed, whether it 'should' happen in
5 years or 50. It will happen when people go out and
make it happen. Damn the predictions; full speed
ahead!
> Once success starts to look even remotely likely, it
> will come to the attention of the government and the
> public in a big way. You run the risk of regulators
> coming down on you.
Yes, and we need to ensure that these people get the
Singularity meme, not a horribly distorted version of
it.
> If Sing Inst is going to have an actual physical
> location at some point, I would recommend moving to a
> Libertarian-oriented state, where you will be
> surrounded by allies. Consider especially the state
> of New Hampshire, where transhumanists and
> Libertarians are gathering to form a revolutionary
> base. There would be strength in numbers there.
> Isolated as we are now we're sitting ducks.
Sitting ducks for what? It's not like Eli has
assassins after him.
> I also wonder just how much information on AGI should
> be shared with the general public whilst the projects
> are going on? I see that Ben is publishing quite a
> lot about his project, and Eliezer has publicly
> published quite a lot as well. Be aware that anyone
> with net access can read all that. Dictators in places
> like North Korea, the odd psychopath... is it wise to
> provide too much information about how to create AGI?
> On the one hand, sharing can advance research; on the
> other, the risk of someone creating Unfriendly AI is
> increased. And of course, shorter-term AI results
> could have substantial proprietary value.
Damn the value, full speed ahead! Yes, it is a
tradeoff between the risk of creating UFAI and the
risk of not sharing insight.
> To tell you the truth, I was slightly uneasy even
> posting those few very general ideas to sl4 that you
> see in my last couple of posts.
> Most of the people working in A.I probably visit sl4
> and copy everything down. They're probably ripping
> off all our ideas without a second thought.
Ummm... Do you have any proof?
> The marketing side of it needs to be a lot more
> careful as well. Some things (like the Sysop idea)
> will just cause people to go ballistic. Other things,
> like wild speculation about life after the
> Singularity, will just cause people to dismiss it
> as sci-fi fantasy.
> I would never have talked about
> the Singularity at all.
If people don't become interested in the Singularity,
how are we supposed to get donors, let alone AI
programmers?
> And for God's sake, stop speculating
God does not exist.
> about life after the Singularity! Most people just
> don't believe a word of this stuff. There is too much
> hype and far too many 'slip-ups' on the marketing
> side. And the few non-scientific people in the
> population who do believe this stuff are scared
> shitless by it.
Future shock. Taking an ordinary person and
introducing him to Staring into the Singularity will
get him quite shocked and scared, yes, if not outright
panicked.
> AI will upset religious and social institutions.
That's a good thing!!!
> All of this may sound a bit paranoid, but really Sing
> Inst needs to start thinking about these things. It's
> just 'Singularity Realism'.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT