From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Apr 24 2006 - 10:07:20 MDT
Philip Goetz wrote:
> My intuition, based on experience with how much computational power it
> takes to solve a problem of a particular size, and on Rescher's law of
> logarithmic returns, is that exponentially-increasing computational
> power is required to provide a linear increase in "smartness", or some
> measure of the problems we can handle. For instance, finding primes
> of 2N bits takes much more than twice the computational power of
> finding primes of N bits.
>
> I also expect that the computational complexity of cognition is an
> exponential function of the size of working memory, so that if we
> currently have a working memory that can store 5 chunks, the amount of
> computation available in the universe limits us to some double-digit
> number of chunks.
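
To make the prime-finding claim concrete: a random b-bit integer is prime
with probability about 1/(b ln 2), and one Miller-Rabin round on a b-bit
candidate costs roughly b^3 bit operations with schoolbook arithmetic, so
the expected work of finding a prime grows roughly like b^4 - doubling the
bit length multiplies the cost by about sixteen, not two. Here is a minimal
timing sketch of my own (the bit sizes and trial count are arbitrary
choices; sympy.randprime is a real function):

import time
from sympy import randprime

def avg_prime_time(bits, trials=5):
    """Average wall-clock seconds to find a random prime of `bits` bits."""
    total = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        randprime(1 << (bits - 1), 1 << bits)  # random prime in the b-bit range
        total += time.perf_counter() - start
    return total / trials

for bits in (256, 512, 1024):  # each step doubles N
    print(f"{bits:4d} bits: {avg_prime_time(bits):.3f} s")

If the rough b^4 scaling holds, each doubling of the bit length should
multiply the printed averages by about an order of magnitude rather than
by two.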
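The working-memory bound admits the same back-of-the-envelope treatment.
If cognition costs BASE**chunks operations, and the universe supplies at
most about 10**120 elementary operations (Seth Lloyd's estimate - an
assumption of mine, not a figure from Goetz), then the largest affordable
working memory is log(budget)/log(BASE):

import math

UNIVERSE_OPS = 10**120  # Lloyd's rough bound on total computation; an assumption

for base in (10, 1000, 10**6):  # hypothetical cost multipliers per added chunk
    max_chunks = math.log(UNIVERSE_OPS) / math.log(base)
    print(f"cost ~ {base}**chunks -> at most {max_chunks:.0f} chunks")

Any base of 1000 or steeper puts the limit in the double digits, which is
the shape of Goetz's claim.
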
As we all know, humans required thousands of times as much brain
tissue as chimpanzees to produce only a small increment in performance;
if you look around on the street, you can easily see that each
additional 10 IQ points requires a rough doubling of cranial volume. If
you still think the ascent of AIs will be rapid, a further caution is
provided by the evolutionary history of the hominid family: After
requiring only 50,000 years to go from Australopithecus to late Homo
erectus, it then required another five million years to produce Homo
sapiens. Most of what we think of as impressive benefits and major
impacts of "human" intelligence, such as guns and nuclear weapons, were
invented by monkeys twenty million years ago. Our closest cousins, the
chimpanzees, have most human abilities - including combinatorial
language, machines with moving parts, and crude scientific journals -
which also suggests that it is unlikely for any particular AI project to
get many major abilities in advance of other AI projects. The ongoing
military and economic competition between Cro-Magnons and Neanderthals
has stalemated for millennia; slight improvements in brainpower simply
do not amount to all that much in the real world.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence