From: Damien Broderick (thespike@satx.rr.com)
Date: Mon May 01 2006 - 13:09:02 MDT
At 09:45 AM 5/1/2006 -0700, Bob Seidensticker wrote:
>Michael: it sounds like you think technologies like nanotech and AGI are not
>only inevitable but close. Why do you say that, given the poor record of
>the futurist community in predicting the future? You know the long list of
>failed predictions as well as I do -- moon bases, videophones, and so on.
Don't you see how ludicrous this comparison is? Why not go the whole
hog and mock the absence of gigantic zeppelins cruising the skies at
an amazing 120 m.p.h.!! Or the immense wooden sailing craft that could
circle the world in only months!!!
We don't have moon bases for the same reason we're not recovering
from spasm nuclear war, another frequent image in '50s sf. It turned
out not to be a good strategy. Bear in mind that Clarke's idea for
geostationary radio sats was of vast objects filled with human
switchboard operators. The difference between then and now is
precisely the driver that pushes the singularity. Instead of monkeys
in local space, we have lightweight, powerful probes to the edge of
the solar system sending back scads of images and data. Instead of
large fixed videophones, we have thin pocket cellphones that take pix
and transmit them instantly to anywhere in the world. The hazard of
prediction isn't overheated imagination--it's an inevitable failure
to see how realistic goals will be attained by other, more
technically advanced means, and that indeed some goals will be
abandoned as foolish, short-sighted and wasteful.
MNT might turn out to be accomplished by tweaked RNA rather than
mechanosynthesis, but if the end result realizes our wishes, so what?
Your critique of Sunday journalist "futurism" is accurate, but nobody
in this forum needs to be told that. If you wish to reprove Vernor
Vinge, say, for his lack of understanding, you'll need to address his
now-classic arguments from 20 years ago concerning the likely impacts
of self-incrementing AI (whenever it happens). It's interesting that
Vinge's name doesn't appear even once in the index of FUTURE HYPE,
nor does "technological singularity". Nor, for that matter, does
Yudkowsky's. It's time to stop taking pot-shots at the lazy anoxic
fish in the brine barrel and go instead after the big sharks.
Damien Broderick