From: Joshua Fox (firstname.lastname@example.org)
Date: Fri May 12 2006 - 02:23:59 MDT
Eliezer S. Yudkowsky wrote:
> .... the way that some people apply their theories of human
> irrationality to the larger processes of cognition (in this case,
> judgments of future scenarios), and come to conclusions as if the
> Conjunction Fallacy, and the general lack of logical reasoning skills,
> were the main determinants in those analyses.
Because flaws in rationality such as the Conjunction Fallacy were not
explicitly adaptive in the human evolutionary environment, might they be
necessary side-effects of any practical General Intelligence? Every
design requires compromise in some parameters, and perhaps this is an
/essential/ compromise for the first generation of AGI.
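The probabilistic error behind the Conjunction Fallacy is simple to state: for any two events A and B, P(A and B) can never exceed P(A). A minimal sketch, using invented probabilities loosely modeled on Tversky and Kahneman's classic "Linda" example:

```python
# Conjunction Fallacy: judging P(A and B) > P(A) violates a basic
# probability axiom. The probabilities below are invented for
# illustration only.
p_bank_teller = 0.05   # P(Linda is a bank teller)
p_feminist = 0.30      # P(Linda is active in the feminist movement)

# Even if the two events were perfectly correlated, the conjunction
# cannot exceed the less probable conjunct:
p_conjunction_max = min(p_bank_teller, p_feminist)

assert p_conjunction_max <= p_bank_teller
print(p_conjunction_max)  # 0.05
```

Subjects who rate the conjunction as more probable than the single event are making exactly the error the assertion above rules out.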
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT