From: Joel Pitt (email@example.com)
Date: Fri Jul 15 2005 - 23:11:50 MDT
Peter de Blanc wrote:
> IMO, any AGI which is not explicitly Friendly should be considered
> Unfriendly. Even if you successfully engineer an AGI which decides to
> leave humans alone, this technology can be turned into dangerous UFAI
> far more easily than it can be turned into FAI, which needs to be
> designed Friendly from the beginning.
What is the reasoning behind ve being more easily turned into UFAI?
Perhaps ve just decided that the best course for humans is for them to
learn for themselves, and was going to sit back and observe all of
existence (assuming ve had already worked out that humanity could
survive on its own).
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT