Re: [sl4] How much do languages matter for AGI programming?

From: Charles Hixson (charleshixsn@earthlink.net)
Date: Sat Nov 29 2008 - 12:10:05 MST


Edward Miller wrote:
> I was just reading about programming methods that I had never heard of
> before such as Aspect-oriented programming and Subject-oriented
> programming. I was thinking what consequences the programming language
> has for AGI.
>
> I know there are lots of flame wars regarding the superiority of
> programming languages, and the Artificial Intelligence community has
> been arguing over it for years. Some prefer Logic Programming, while
> others like Marvin Minsky prefer good ol' procedural programming.
> Finally, I believe Eliezer has recommended Java, correct me if I am wrong.
>
> I was looking over a lot of the criticisms of all these languages, and
> it seems to become much more serious when you think about what it
> could mean for AGI. Yet, even within a particular programming paradigm
> there is much variation. I am reminded of the people who prize C#
> because it is impossible to have buffer overflows and so forth.
> Ironically, Microsoft is coding a new OS in C# named Singularity.
> Perhaps these sorts of problems are relevant though.
>
> Is there any way to know which would be the best for specifically a
> recursively-improving AGI? Or does it not matter what the base
> language is because the AGI will just evolve new languages? That would
> sort of assume that there is really only one perfect type of
> intelligence that all roads lead to, which I am not so sure about...
> (I am imagining Eliezer's minds-in-general diagram). This makes the
> programming paradigm all the more relevant.
>
> I would recommend an open source language, since you definitely want
> all the code. How can the AGI recursively improve if it doesn't have
> full access to its own code down to the compiler? I can't say I'm an
> expert on any of these new languages. I would recommend not
> overlooking obscure ones though, like the D programming language.
> Granted, if solving this problem requires learning or inventing some
> new obscure programming method, this would be a burden, but maybe a
> necessary one.
This is largely a matter of taste, however:

Most computer languages are equivalent in power: each is as complete as
number theory, at least for numbers up to a certain finite size.
Currently 2^63 - 1 is the most common limit.
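
For concreteness, here's a tiny sketch in C (purely illustrative, not
part of any AGI design) of where that finite limit sits: the common
signed 64-bit word tops out at 2^63 - 1, and arithmetic past that bound
overflows.

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* The largest value a signed 64-bit word can hold: 2^63 - 1. */
    printf("%" PRId64 "\n", INT64_MAX);   /* 9223372036854775807 */
    /* Arithmetic past this bound overflows, which is where the
       "as complete as number theory" equivalence stops. */
    return 0;
}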

OTOH, different languages make it easier to think of different
implementations. This is arguably significant, but mainly for design.
Implementation in some languages is easier than in others, but those
languages tend to lose that advantage because the implemented code is
slower. For this combination of reasons my preferred language is D,
which has most of the design advantages without the cost at execution
time. (OTOH, D is dependent upon C libraries...well, it's a new
language, and that's only a significant drawback when the libraries
expect to make callbacks.)
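
To make the callback caveat concrete, here's a minimal C sketch (an
illustration, nothing D-specific): a plain call into a C library is
easy to bind from another language, but an API like qsort() calls back
into caller-supplied code, so the binding language has to be able to
expose a function with C linkage and exactly the expected signature.

#include <stdio.h>
#include <stdlib.h>

/* This is the shape the binding language must be able to supply: a
   function with C linkage that the C library will call back into. */
static int compare_ints(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int xs[] = { 3, 1, 2 };
    qsort(xs, 3, sizeof xs[0], compare_ints);
    printf("%d %d %d\n", xs[0], xs[1], xs[2]);   /* 1 2 3 */
    return 0;
}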

D does have two open source implementations, though one is lagging and
seems unmaintained, and the other isn't yet complete. Still, their
existence shows that the language could be maintained independently of
its originator (Walter Bright).

A better language for basic design is probably Python, but it's fatally
slow at executing large programs. I'm not considering Java, as I
consider it both slower than a good compiled language and clumsier to
use than Python or Ruby. (Actually, I consider Java clumsier than D. If
D didn't exist I'd probably be recommending Ada, despite its lack of
built-in garbage collection and the awkwardness of string literals of
different lengths being of different types. [Ada is of an older
paradigm, and its Unbounded_String representation is, possibly
intentionally, more difficult to use than strings of defined length.])

Well, Ada 2005 has supposedly cleared up many of the warts...but they
still haven't made garbage collection a standard feature of the
language (i.e., easy to use and dependably present), and D *does*
exist, so I haven't looked at it again. Ditto for Fortran 2007(?), and
for C++ 2008 (or whatever they're calling it). A good compiled Java
might be worth looking into, but I really find the Java IO libraries a
nuisance. Still, Java has LOTS of libraries, and that might be a
sufficient benefit...but it would require the speed that can only be
obtained with compiled code. (There are parts of an AI that would
benefit more from the flexibility of interpreted code than from the
speed of compiled code, though not as the base layer, and for those
Python has a clear lead...and a well-defined linkage convention that
lets it use the C calling convention. Or just write the interface layer
in Pyrex, an extended subset of Python that compiles to C. [But don't
use it for large chunks; that's not what it was designed for.])
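
For concreteness, here's a rough sketch of what such an interface layer
looks like at the C level (the module and function names are invented
for illustration; it's hand-written here, but it's essentially the kind
of glue Pyrex generates, using the Python 2-era C API):

#include <Python.h>

/* The "fast" compiled routine; in a real system this would sit in the
   compiled base layer. */
static double evaluate(double x)
{
    return x * x + 1.0;
}

/* Interface function: unpack the Python arguments, call the compiled
   code, and wrap the result back up as a Python object. */
static PyObject *py_evaluate(PyObject *self, PyObject *args)
{
    double x;
    if (!PyArg_ParseTuple(args, "d", &x))
        return NULL;
    return PyFloat_FromDouble(evaluate(x));
}

static PyMethodDef methods[] = {
    { "evaluate", py_evaluate, METH_VARARGS, "Run the compiled routine." },
    { NULL, NULL, 0, NULL }
};

/* Python 2 style module initialization; "import fastcore" triggers it. */
PyMODINIT_FUNC initfastcore(void)
{
    Py_InitModule("fastcore", methods);
}

From the interpreter side that's just "import fastcore" followed by
fastcore.evaluate(2.0); the flexible interpreted layer stays in Python
while the hot path stays compiled.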


