From: Russell Wallace (russell.wallace@gmail.com)
Date: Fri Jun 17 2005 - 09:54:59 MDT
On 6/17/05, p3 <rm3cpp@yahoo.com> wrote:
> I don't understand why the development of molecular
> nanotechnology will mean the inevitable destruction of
> all things everywhere (on earth, at least)
It won't in and of itself - time and evolution will do that on their
own; but what nanotechnology will do, by making matter as malleable as
software, is greatly shorten the timescales involved.
> or why the
> development of smarter-than-human intelligence will
> somehow avoid this disaster.
Again, AI won't in and of itself avoid disaster, but it does appear to
be a prerequisite tool for any workable long-term strategy of disaster
avoidance; here's my attempt at explaining why:
http://homepage.eircom.net/~russell12/dp.html
- Russell