Re: Fun with Experimental Design [WAS: Re: The Conjunction Fallacy Fallacy]

From: Eliezer S. Yudkowsky (
Date: Sat May 06 2006 - 19:22:05 MDT

Ben Goertzel wrote:
> And, when you tell him the evidence his judgments are based on was
> highly biased, he may not believe you -- because he may believe your
> judgment is flawed because you're not smart enough to assess such
> things, etc...
> In this case, among others, you might disagree with someone who you
> believe to be rational...

To be precise, you've just stated that you can have a persistent
disagreement with someone whom you believe to be rational, who does not
believe you to be rational. But this implies a further dispute over
your own rationality - you and the other have different probability
assignments about this. Do you believe that the person has biased data
in this dispute, or that he is not rationally evaluating your own
rationality?

IIRC, I think one of Robin Hanson's papers tries to extend Aumann's
Agreement Theorem to prove that either you don't believe the other
person is meta-rational (rational in evaluations of self and others'
rationality), or you believe that you were born with better "ur-priors",
that is, prior probabilities before any evidence whatsoever comes in (as
opposed to priors in any particular situation, like mammographies and so
on). Robin Hanson concludes that since you have no reason to think you
were luckily born with better ur-priors, you are being irrational.
Personally, I think the concept of "meta-rationality" is poorly defined,
because the notion of "trust in the probability assignments of a
cognitive process" is something I'm still trying to define rigorously.
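The ur-priors point is easy to see in a toy Bayesian update. Two agents
who agree on the likelihoods and see the same evidence still end up with
different posteriors if they started from different priors - so a
persistent disagreement can be traced all the way back to where the
priors came from. A minimal sketch (the specific numbers here are
invented for illustration, loosely in the spirit of the mammography
example):

```python
def bayes(prior, sensitivity, false_positive_rate):
    # Posterior P(condition | positive test) by Bayes' theorem.
    p_pos = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_pos

# Same test, same evidence (one positive result), same likelihoods --
# the only difference between the two agents is the prior they walked
# in with.  (All numbers are made up for the sketch.)
posterior_a = bayes(prior=0.01, sensitivity=0.8, false_positive_rate=0.1)
posterior_b = bayes(prior=0.001, sensitivity=0.8, false_positive_rate=0.1)
print(posterior_a, posterior_b)  # roughly 0.075 vs. 0.008
```

Neither agent made a calculation error; the gap in their posteriors is
entirely inherited from the gap in their priors, which is exactly why
Hanson pushes the question back to whether you have any reason to think
your ur-priors are the better ones.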

For those just tuning in, Aumann's Agreement Theorem is the base result
that shows that if perfect Bayesians have common knowledge of each
other's probability assignments (I know, you know I know, I know you
know ad infinitum) then they have the same probability assignments. The
original Agreement Theorem has been extended in dozens of different ways
by weakening various assumptions; there's a cottage industry built
around it.
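One of those extensions, due to Geanakoplos and Polemarchakis, makes the
"I know you know..." exchange concrete: two agents with a common prior
but different information repeatedly announce their posteriors for some
event, each round learning from what the other's announcement reveals,
and in a finite state space they provably converge to the same posterior
in finitely many rounds. Here is a toy simulation of that back-and-forth
(the state space, partitions, and event below are invented for
illustration, not taken from any paper):

```python
from fractions import Fraction

def cell(partition, w):
    # The block of the partition containing state w (the agent's info set).
    return next(c for c in partition if w in c)

def post(prior, c, event):
    # P(event | c) under the common prior.
    return sum(prior[w] for w in c & event) / sum(prior[w] for w in c)

def refine(partition, label):
    # Split each block by what the other agent would have announced at
    # each state -- hearing the announcement rules out states where the
    # other agent would have said something different.
    out = []
    for c in partition:
        groups = {}
        for w in c:
            groups.setdefault(label[w], set()).add(w)
        out.extend(groups.values())
    return out

def exchange_posteriors(prior, parts, event, true_state, max_rounds=20):
    history = []
    for _ in range(max_rounds):
        ann = tuple(post(prior, cell(p, true_state), event) for p in parts)
        history.append(ann)
        if ann[0] == ann[1]:   # announcements are common knowledge and equal
            break
        labels = [{w: post(prior, cell(p, w), event) for w in prior}
                  for p in parts]
        parts = [refine(parts[0], labels[1]), refine(parts[1], labels[0])]
    return history

# Nine equally likely states; the two agents carve them up differently.
prior = {w: Fraction(1, 9) for w in range(9)}
p1 = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8}]
p2 = [{0, 1, 2, 3}, {4, 5, 6, 7}, {8}]
event = {2, 3, 4}
hist = exchange_posteriors(prior, [p1, p2], event, true_state=2)
print(hist)  # starts at (1/3, 1/2), converges to (1/3, 1/3)
```

The point of the sketch is that neither agent ever shares raw evidence;
the posteriors alone, iterated to common knowledge, are enough to force
agreement - which is what makes persistent disagreement between mutually
trusted Bayesians so hard to sustain.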

Eliezer S. Yudkowsky                
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT