Re: Open Source Friendly AI? (was Singularity and the general public)

From: Brian Atkins (brian@posthuman.com)
Date: Fri Apr 13 2001 - 00:17:31 MDT


James Higgins wrote:
>
> At 11:41 PM 4/12/2001 -0400, Declan McCullagh wrote:
> >On Fri, Apr 06, 2001 at 08:33:09PM -0400, Eliezer S. Yudkowsky wrote:
> >Secrecy, even discussions of it, will increase their fear and distrust of
> >you. Your only option is to play a game of chess, where the moves are open,
> >rather than poker.
>
> I quote the above merely because it gave me the spark to think about what follows.
>
> Has there been any serious discussion about making this an open source
> project? Instead of debating how open to be, if/when to hide, etc. maybe
> you should consider the exact opposite. I believe it has many advantages.

Yep, we've thought about it.

>
> 1) It becomes nearly impossible (definitely impractical) to stop the work
> since everyone has access to it and could continue to build upon it if the
> original authors could not continue.

Maybe, but why bother unless you have to? Perhaps we'll set up a remote
location with a copy of all our work, to be released just in case the
government cracks down on us (highly unlikely...). Isn't it sad that we
even have to worry about such things happening?

>
> 2) It would pull in some of the rogue groups who would go it alone.

I don't see how this would help. By releasing our code and ideas we would
actually encourage and assist splinter groups. If we keep the code and ideas
more private, then they have to come and talk with us if they don't want to
try to reinvent the wheel.

>
> 3) Open Source could massively speed up the process. Instead of having a
> few coders working on it, thousands or more would be able to
> contribute. (with very high quality control, of course)

Actually, open source has so far generally proven to be a much slower
method of software development, and it gets worse on larger, more complex
projects. How many hackers do you think will really be able to contribute
much to an AI project? Many fewer than can contribute to something like
Mozilla, and look how slowly that project has gone.

>
> 4) Probably the #1 biggest benefit is improved quality. Open Source in
> many ways is the pinnacle of code reviews. Having so many eyes study the
> source would reveal far more errors and problems than an isolated team
> could ever accomplish.

Actually, I've seen that Linux has more bugs reported on Bugtraq than any
other operating system. Is this because it is buggier, or because more
gets found? Perhaps someday Mozilla will become better than Internet
Explorer in terms of stability, but for now the closed source approach
using highly skilled programmers has worked better.

>
> 5) Providing a common, open source Friendly AI system would allow other
> groups who insist on pursuing this themselves to incorporate your friendly
> tech.

Or allow Saddam Hussein to get his evil AI up and running that much faster.
Realistically, real AI is an extremely powerful technology, and our view
is not to hand it over to people we don't know and trust.

>
> If your ultimate goal really is to get to the Singularity as soon as
> possible, before a non-friendly Singularity can occur, then I think this is
> an ideal path to follow.

We disagree, and would only pursue such a pathway as a last resort.

-- 
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.intelligence.org/


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT