From: Gordon Worley (firstname.lastname@example.org)
Date: Sun Nov 11 2001 - 12:33:29 MST
While writing the FAQ, I recently realized something that, while pretty
obvious once I saw it, I guess I had not thought of before for some reason.
We write a GIAI and then we let it start to ascend. Assuming it's
Friendly, at some point it may stop moving toward the Singularity. The
reason is that, using its greater intellect, it finds that the Singularity
is not the optimal path to a better state of existence (i.e., a more
Friendly existence), but that something else is.
Now, based on the knowledge we have, I have a hard time imagining what
such a future might be. But then again, before Vinge wrote about the
Singularity, I doubt many people, if any, had drawn the same conclusions
he did from the same data that was available to everyone.
If you're curious, my initial thought is that the Friendly AI may reason
that the Sysop, or something like it, won't work completely enough to
ensure that the universe doesn't come to an end post-Singularity.
--
Gordon Worley
http://www.rbisland.cx/
email@example.com
PGP: 0xBBD3B003

`When I use a word,' Humpty Dumpty said, `it means just what I choose
it to mean--neither more nor less.'
--Lewis Carroll
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT