From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Wed Jun 02 2004 - 10:33:44 MDT
Michael Wilson wrote:
>>You are positing that all the "really good people"
>>basically agree with the SIAI perspective.
>
> Yes, for the same reason that I posit all the really good
> physicists accept general relativity as the starting point
> for further work. If you think that AI is an area where
> different points of view can be equally good, you have
> already failed.
Er, *that's* going way too far. We should remember that it is always
*okay* for smart people to agree. But that does not mean all the really
smart people will agree with the SIAI perspective right off the bat. In
fact, I don't see how they could. LOGI's obsolete, and it's all we've got
online. A really, *really* smart person will presumably read over LOGI,
snort, and say "Come back when you know about Bayes, whippersnappers."
I do agree that this business of agreeing to disagree is foolish (and
provably non-Bayesian), and really good people surely know this. Let us
not forget that at most one view is correct, the others being wrong, and if
you aren't ready to say your view is correct, it probably isn't.
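(For reference, the "provably non-Bayesian" bit is Aumann's agreement theorem; a
minimal statement, in notation of my own choosing rather than anything from our
published material:

    Two agents share a common prior P and hold private information
    partitions I_1 and I_2. If their posteriors for an event E,
        q_1 = P(E | I_1)   and   q_2 = P(E | I_2),
    are common knowledge between them, then q_1 = q_2. Honest Bayesians
    with a common prior cannot knowingly agree to disagree.)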
But for all that I have begun to understand a few things, I don't have
anything within six orders of magnitude of the weight of reason accumulated in favor of
General Relativity. And what's there isn't even online!
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence