From: Richard Loosemore (firstname.lastname@example.org)
Date: Fri May 12 2006 - 07:33:38 MDT
Joshua Fox wrote:
> Eliezer S. Yudkowsky wrote:
>> .... the way that some people apply their theories of human
>> irrationality to the larger processes of cognition (in this case,
>> judgments of future scenarios), and come to conclusions as if the
>> Conjunction Fallacy, and the general lack of logical reasoning skills,
>> were the main determinants in those analyses.
> Because flaws in rationality such as the Conjunction Fallacy are not
> explicitly adaptive to the human evolutionary environment, might it be
> that these are necessary side-effects of practical General Intelligence?
> Every design requires compromise in some parameters, and perhaps this is
> an /essential/ compromise for the first generation of AGI.
Just for the record: Eliezer did not write the above comments: I did.
Sadly, I am no longer engaging in discussions on this list. I have
wasted too much time trying to pursue meaningful discussion of AGI, only
to find myself fighting the irrationality of a small but noisy clique
who belong to the Bayesian Rationality religion/cult/pseudoscience.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT