Re: State of the SI's AI and FAI research

From: Matt Arnold (
Date: Wed Feb 16 2005 - 14:04:23 MST

Your comments remind me very much of me. I find advice like
"proportion the strength of convictions to the strength of the
evidence" easy to say, but not so easy to put into practice
consistently. Seeking out the strongest contradictory hypotheses and
alternative explanations, and attempting to understand them on their
own terms, is a daunting effort. I hate to admit it, but it's been
too much for me to keep up day in and day out. I don't want to act as
if, just because I care about preconceptions and biasing factors, I am
now immune to them. I don't really know how to regain the mental flexibility
that I used to have.

On Wed, 16 Feb 2005 08:49:48 -0800, Eliezer Yudkowsky
<> wrote:
> Anyone who doesn't realize they might need to radically revise their
> thinking, and then needs to do so, isn't calibrated.
> Anyone who humbly disclaims that they might need to revise their thinking,
> behaves in more or less the same way regardless, and then needs to
> radically revise, makes a subtler mistake.
> --
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:50 MDT