From: Eliezer S. Yudkowsky (email@example.com)
Date: Sat Dec 08 2001 - 20:11:22 MST

Jeff Bone wrote:
> "Eliezer S. Yudkowsky" wrote:
> > we should remember that the model the prediction
> > is based on is a model which historically has often changed and currently
> > is still in flux.
> Yawn. That's a very anti-scientific argument, Eli --- indeed one I often hear
> religious folks (like the Creationist crowd) use to undermine the credibility of
> arguments from a scientific bias.

Alas for the Fundies: Theories that have been killed, such as
creationism, always remain dead forever; science moves forward, but it
never goes back to where it was.

Alas for us: Future progress will kill off some of our own theories, too.

There's a very big difference between saying "science moves forward,
therefore your scientific disproof of my silly idea should not be trusted"
and saying "science moves forward, therefore our extrapolation of the next
billion millennia changes every fifty years".

Any attempt to establish "ultimate physical limits on technology" is a
case in point. Do you really think that an attempt to argue an ultimate
physical limit - as a reality, and not just as one model's interesting
extrapolation - should be treated under the same rules as a disproof of an
exploded flat-Earth theory? Disproving creationism or flat-Earth does not
require that our current model be absolutely complete and that it stand
for all time, merely that we take into account the overwhelming
preponderance of negative evidence against a *disproven* theory.

Establishing a physical limit is the hardest kind of result to establish,
since anything we don't know about could represent a possible loophole.
Considering the rate at which this trick has failed in times past, one is
justified in insisting that a given "physical limit" remain firm for, oh,
at least a couple of centuries, before being accepted as anything other
than an interim approximation. The historical record of "technological
impossibilities" dying off almost as fast as religious texts can't be used
as an argument for the latest crackpot theory of perpetual motion, but it
does mean that we should limit the confidence level of statements about
*all possible* future technologies.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence