Re: analytical rigor

From: Eliezer S. Yudkowsky
Date: Fri Aug 25 2006 - 15:14:21 MDT

Richard Loosemore wrote:
> Eliezer S. Yudkowsky wrote:
>> Just for the record, my main issue with Loosemore so far is that I'm
>> still waiting for a bold, precise, falsifiable prediction - never mind
>> a useful technique - out of his philosophy. So you don't know a bunch
>> of stuff. Great. Personally, I don't know the position of every atom
>> in Pluto. What *do* you know, and why is it useful?
> Eliezer,
> Falsifiable predictions are not the issue, and I think you know that: I
> have said before (very clearly) that this is a question at the paradigm
> level.
> You have read enough that I am sure I do not need to educate you on the
> difference between paradigm-level issues and normal-science issues.

Yes. Paradigm-level revolutions generate much bolder, much more
falsifiable predictions.

> If this were a debate about particular results within a science, your
> request for falsifiable predictions would be justified. But because you
> *know* full well that I have made my statements at the paradigm level
> -- in other words, for people who might be reading this and do not know
> what I mean by that, I am attacking the foundational assumptions and the
> methodology of the mainstream AI approach -- your request for a bold,
> precise, falsifiable prediction is specious.
> [I have said this in the past, and if I recall correctly all I got in
> reply was a dismissive comment that said something like "when someone
> doesn't have anything concrete to say, of course they always trot out
> the 'paradigm' excuse". I sincerely hope this does not happen again.]
Dear Richard Loosemore:

When someone doesn't have anything concrete to say, of course they
always trot out the "paradigm" excuse.

Eliezer Yudkowsky.

Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT