Re: One or more AIs??

From: J. Andrew Rogers (andrew@ceruleansystems.com)
Date: Tue Jun 01 2004 - 00:27:15 MDT


On May 30, 2004, at 4:13 PM, Thomas Buckner wrote:
> I just read in Business Week that the Japanese have
> had the fastest supercomputers in the world since
> about 2001, with the current champion being the 40
> teraflop Earth Simulator. see story
> http://www.ecommercetimes.com/story/hardware/33796.html
>
> I know that speed alone doesn't make the machine an AI
> platform, but it's not meaningless either. I assume
> those dudes are working on AI also. Virginia Tech
> has a much cheaper 17-teraflop system
> composed of Apple computers ($5 million and change,
> compared to about $500 million for the Earth Simulator,
> which does not use commercial off-the-shelf (COTS)
> parts).

The hardware is generally immaterial next to the specifics of the AGI
implementation. This is one of the reasons one is unlikely to
implement an AI of any type by accident, even when trying really
hard.

10^12 iops may be enough for Human Equivalent Intelligence (HEI) on a
nearly ideal implementation -- it is hard to say without doing it
first. The problem is that a slightly less than ideal implementation
may require 10^18 iops to be HEI. And there is a huge range of viable
designs that may require 10^40 iops or more to be HEI. It is an
unforgiving design space, and a lot of people are working on designs
that fall into these latter cases. They may be genuine functional
AGI designs in theory, but not in practice.
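
As a back-of-envelope sketch only (assuming "iops" is read as
operations needed per second of real time, and charitably treating
the Earth Simulator's ~40 teraflop peak as a sustained rate; both are
illustrative assumptions, not claims made above):

    /* Back-of-envelope sketch.  Assumes "iops" = operations needed per
     * second of real time and that the machine sustains its ~40 Tflop/s
     * peak; both are illustrative assumptions. */
    #include <stdio.h>

    int main(void) {
        const double machine = 4e13;                 /* ~40 Tflop/s peak */
        const double need[]  = { 1e12, 1e18, 1e40 }; /* ideal .. far off */
        for (int i = 0; i < 3; i++)
            printf("needs %.0e ops/s -> runs at %.1e x real time\n",
                   need[i], machine / need[i]);
        return 0;
    }

On those assumptions the 10^18 case runs at roughly 1/25,000 of real
time on the fastest machine in the world, and the 10^40 case is not
worth discussing.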

Flogging a pony that is pining for the fjords: Memory latency is the
limiting factor in modern hardware, not teraflops. In fact, there is no
great necessity for floating point at all; that is primarily of use to
the people doing "biologically inspired" AI research. There is very
little "computation" in intelligence when you get right down to it.
And nothing about this algorithm space suggests that it is
well suited to cluster computing. Clusters have very limited
application to AGI, as AGI looks a lot more like a genuine
supercomputing application (there is a reason there is still a strong
market for genuine supercomputers rather than just doing everything
with clusters).
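
To make the latency point concrete, here is a minimal sketch
(illustrative only; none of it is from the discussion above): a
dependent pointer chase through an array too large for cache is paced
by DRAM latency, while a sequential scan of the same data is paced by
bandwidth and arithmetic throughput. On commodity hardware the chase
is typically many times slower, even though it does less arithmetic.

    /* Illustrative sketch: dependent pointer chasing vs. a sequential
     * scan over the same ~64 MB array.  The chase does less arithmetic,
     * but each load must wait for the previous one, so it is paced by
     * memory latency rather than by flops. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 24)   /* 16M ints, ~64 MB: well beyond cache */

    int main(void) {
        int *next = malloc(N * sizeof *next);
        if (!next) return 1;

        /* Sattolo's algorithm: a random single-cycle permutation, so
         * the chase visits every element in cache-hostile order. */
        for (int i = 0; i < N; i++) next[i] = i;
        for (int i = N - 1; i > 0; i--) {
            int j = rand() % i;
            int t = next[i]; next[i] = next[j]; next[j] = t;
        }

        clock_t t0 = clock();
        int p = 0;
        for (int i = 0; i < N; i++) p = next[p];     /* latency-bound   */
        double chase = (double)(clock() - t0) / CLOCKS_PER_SEC;

        t0 = clock();
        long long sum = 0;
        for (int i = 0; i < N; i++) sum += next[i];  /* bandwidth-bound */
        double scan = (double)(clock() - t0) / CLOCKS_PER_SEC;

        printf("chase %.2fs  scan %.2fs  (p=%d sum=%lld)\n",
               chase, scan, p, sum);
        free(next);
        return 0;
    }

The single-cycle shuffle is only there to defeat the prefetcher; any
sufficiently cache-hostile access pattern makes the same point.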

So don't dwell on how big or fast hardware is becoming, as it doesn't
really matter in the big picture. Excellent design trumps good design
and gobs of hardware in this particular case.

j. andrew rogers


