Re[2]: continuity of self [ was META: Sept. 11 and Singularity]

From: Cliff Stabbert
Date: Sun Sep 15 2002 - 18:40:27 MDT

Sunday, September 15, 2002, 7:19:38 PM, Eliezer S. Yudkowsky wrote:

ESY> Cliff Stabbert wrote:
CS> How big would the temptation be for any current superpower to grab
CS> the first workable nanotech or the first usable general AI and use
CS> it to wield power over others?

ESY> Stealing the first workable nanotech is one thing. "Stealing" a
ESY> general AI is a bit different from stealing its physical
ESY> hardware. AI is not a tool. It is a mind. Moving an AI from
ESY> one place to another doesn't make it a tool in your hands, any
ESY> more than moving Gandhi from India to Germany causes him to
ESY> become a Nazi. Now of course potential thieves may not know
ESY> that, but if so, failing to keep track of the distinction
ESY> ourselves is hardly conducive to enlightening them.

You're right; but my concern in the case of AI tech is less about what
uses they (erroneously or not*) believe they could put it to than about
research efforts being halted. This is also to some degree a concern
with nanotech, although in that case actual usage seems the larger risk
factor. And arguably, compared to AI, nanotech has less potential for
private development -- at some point you need big labs, big equipment,
and visibility in the academic/scientific world (unless it's already a
government op), so shutting down the research seems harder.

* There are a number of approaches to general AI -- perhaps some of
  them, in their pre-human-level stages, may be subvertible.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT