From: Marc Geddes (marc_geddes@yahoo.co.nz)
Date: Fri Mar 05 2004 - 02:33:18 MST
I'm glad to see that Sing Inst finally has an
executive director. Good luck!
Some thoughts on the future: If you're committed to
the task of creating FAI, you need to realize that
you're in it for the long haul. Utopian visions of a
Singularity in only 5-20 years are almost certainly
fantasy (although I'd love to be proved wrong).
People at the forefront of research are almost always
far too overoptimistic about how long it will take
to achieve major breakthroughs. Fundamental knowledge
about general intelligence is still missing. It is
estimated that only about 2% of everything there is to
know about cognitive science is known. Even with
exponential progress, 100% knowledge wouldn't be
reached for another 40-50 years.
The first design and coding effort at a major AGI is
not likely to succeed, although it will be progress,
and lessons will be learned. The second-generation
effort won't succeed either, although again it will
advance the field. The third-generation effort...
might be closing in on it. And the fourth-generation
effort... that's the one that will get there ;)
I would say that a 40-60 year time frame for success
is realistic. That's what I would call 'Singularity
Realism'. You have to be in it for the long haul.
Once success starts to look even remotely likely, you
will come to the attention of the government and the
public in a big way. You run the risk of regulators
coming down on you.
If Sing Inst is going to have an actual physical
location at some point, I would recommend moving to a
Libertarian-oriented state, where you will be
surrounded by allies. Consider especially the state
of New Hampshire, where transhumanists and
Libertarians are gathering to form a revolutionary
base. There would be strength in numbers there.
Isolated as we are now, we're sitting ducks.
I also wonder just how much information on AGI should
be shared with the general public whilst the projects
are going on. I see that Ben is publishing quite a
lot about his project, and Eliezer has publicly
published quite a lot as well. Be aware that anyone
with net access can read all that. Dictators in China
and North Korea, the odd psychopath... is it wise to
provide too much information about how to create AGI?
On the one hand, sharing can advance research; on the
other, the risk of someone creating Unfriendly AI is
increased. And of course, shorter-term AI results
could have substantial proprietary value.
To tell you the truth, I was slightly uneasy even
posting to sl4 the few very general ideas you see in
my last couple of posts.
Most of the people working in AI probably visit sl4
and copy everything down. They're probably ripping
off all our ideas without a second thought.
The marketing side of it needs to be a lot more
careful as well. Some things (like the Sysop idea)
will just cause people to go ballistic. Other things,
like wild speculation about life after the
Singularity, will just cause people to dismiss it all
as sci-fi fantasy. I would never have talked about
the Singularity at all. And for God's sake don't talk
about life after the Singularity! Most people just
don't believe a word of this stuff. There is too much
hype and far too many 'slip-ups' on the marketing
side. And the few non-scientific people in the general
population who do believe this stuff are scared
shitless by it. AI will upset religious and social
norms.
All of this may sound a bit paranoid, but really Sing
Inst needs to start thinking about these things. It's
just 'Singularity Realism'.
Cheers!
=====
Please visit my web-site at: http://www.prometheuscrack.com