From: justin corwin (firstname.lastname@example.org)
Date: Mon Apr 24 2006 - 12:43:38 MDT
On 4/24/06, Eliezer S. Yudkowsky <email@example.com> wrote:
> I wouldn't go that far. I would say that it sounds plausible to
> computer scientists but conflicts with experimental observations in the
> brain sciences and in evolutionary biology.
This is not necessarily the case. It could be so, if you assume that
there is indeed only an infinitesimal difference between us and chimpanzees.
OR, more plausibly, it could apply only to the boundary capabilities
of intelligence in general, where humans and chimpanzees represent
implementations nowhere near "optimal", and thus can have vast
differences in 'performance' without conforming to the proportional
computational resource differential. Chimpanzees and Neanderthals
could very plausibly be simply missing architectural elements that
reduce computational cost for intelligent behavior.
The problem with these kinds of arguments is that the terms imported
from, say, algorithm analysis don't actually deal with any of the
interesting complexity of the issue. If linear increases in
intelligence can be assumed to be continuous (continuous how? relative
to what?), what is the relationship to computational cost? Well, what
kind of computational cost? Just running the mind? What about perception?
(we use a lot of neurons to see; some of those might be wasted towards
general intelligence) What about spatial analysis? What about just
sitting there thinking, or remembering?
The whole thing strikes me as very blue-sky, whichever position you
take. Although Eliezer's joke was very funny, I don't think it was
very illustrative.
-- Justin Corwin firstname.lastname@example.org http://outlawpoet.blogspot.com http://www.adaptiveai.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT