RE: An essay I just wrote on the Singularity.

From: Mitchell Porter
Date: Wed Dec 31 2003 - 04:24:19 MST

From the essay:

>a Friendly AI must be the first being to develop strong nanotech on Earth,
>or one of the first, or we are all going to die in a mass of grey goo

Something about this strikes me as wrong-headed. Since
grey goo is considerably closer in time than Friendly AI,
an imperative like this will lead people to focus on stopping
"strong nanotech", rather than creating Friendly AI.

I don't think we have the luxury of assuming that "Friendly
AI before strong nanotech" is very likely. The human race
will do what it can to minimize the risks and plan for bad
contingencies, on the model of today's biodefense/biosecurity
strategies (which address, not just biowarfare, but also
naturally emerging diseases). The advent of nanotechnology
does not lead immediately to extinction, and will certainly
accelerate AI (not necessarily Friendly), because of the
increased capacity to study and emulate the human brain,
and the ability to make nanocomputing grids of arbitrary size.

Also: "strong nanotechnology" has in the past denoted any
form of molecular manufacturing, whether feedstock-driven
or "free-living", with "weak nanotechnology" being all the other
nanoscale stuff. I think it's one of umpteen terms that
Foresight has invented in order to distinguish the assembler
vision from other activities claiming the name of nanotechnology.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT