From: Peter de Blanc (peter.deblanc@verizon.net)
Date: Fri Feb 03 2006 - 11:26:16 MST
On Fri, 2006-02-03 at 08:42 -0800, Jeff Herrlich wrote:
> As a fallback strategy, the first *apparently* friendly AGI
> should be duplicated as quickly as possible. Although the first AGI
> may appear friendly or benign, it may not actually be so (obviously),
> and may be patiently waiting until it has acquired adequate power
> and control. If it is not friendly and is concerned only with its
> own survival, the existence of other comparably powerful AGIs could
> shift the strategic balance in favor of the survival of at least
> some humans.
You're assuming an observer-centric goal system (and even granting that
assumption, duplication still wouldn't help us; why would it?).