RE: Open Source Friendly AI? (was Singularity and the general public)

From: Patrick McCuller (
Date: Fri Apr 13 2001 - 12:37:17 MDT

> On Fri, Apr 13, 2001 at 02:17:31AM -0400, Brian Atkins wrote:
> > I don't see how this would help. By releasing our code and ideas
> we actually
> > encourage/help along splinter groups. If we keep the code and ideas more
> > private then they have to come and chat with us if they don't want to try
> > to reinvent the wheel.
> You could release it under a license that strongly discourages, if not
> outright disallows, splinter groups.

        This would not be an 'open' license. It might get the source code out there,
but it removes the point of open source. Open source, way back when, was all
about splinter groups. It was all about, and is still all about, customizing
and building on existing software without the need to completely reinvent it.
Prevent splinter groups, and where are you? You're just giving away source code.

        And let's not kid ourselves about where it goes. License be damned, once
source code is out in the world it goes everywhere. I know of two commercial
products that use some amount of 'open source' code without any
acknowledgement, and I'm certain that that is a common practice.

        In all, very little is accomplished on the positive side. It won't add
much to the code base, if anything, and it destroys the intellectual property
of the contributors, which, if this project is not successful, may at least be
worth something as a consolation prize.

> > Actually I've seen that Linux has more bugs reported on Bugtraq than any
> > other operating system. Is this because it is buggier, or because more
> > stuff gets found? Perhaps someday Mozilla will become better than Internet
> > Explorer.
> Probably the latter.

        As a highly experienced programmer who has worked on many 'closed' software
projects and several open source projects, I feel strongly that it is
definitely both, but primarily the former.

        Open source software has a tendency to collect weak code. The dynamics of why
this happens are open to debate, but it definitely happens. Shrink-wrapped
software usually has code controls that force programmers to make, if not
bug-free or perfect software, at least reasonably readable and maintainable
code. Don't bother pointing out exceptions to me; I know that most software
projects fall far short of this ideal. My contention is that open source
projects suffer from these problems even more often.

        I can't even express my dismay the first time I tried to improve gcc. It had
a really obvious bug that I was sure I could fix, but when I opened up the
source code, I was stunned. Absolutely stunned that the thing works at all.
Tens of thousands of undocumented lines of spaghetti C code. There was nothing
I could do without spending dozens of hours tracing even the startup code.

> > To be realistic, real AI is an extremely powerful technology, and our view
> > is to not hand it over to people we don't know and trust.
> What, do you think that corporate spies or government ones won't
> be able to acquire your code one way or another? If it's that powerful,
> someone will resort to physical violence to get it. Are your principles
> so strong that you would refuse to give up the source if a family member
> were kidnapped and held for ransom?
> If it's that powerful, people will want it. And I daresay they'll get it.

        They may not even realize it's there. The US government alone has sunk
billions into AI, and they got squat out of it. As far as they're concerned,
practically speaking, it's a dead issue. The singularity will happen before
they ever realize they're not in control anymore.

> -Declan

Patrick McCuller

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT