RE: The Taoist Transhuman

From: pdugan (pdugan@vt.edu)
Date: Sat Jul 16 2005 - 01:58:52 MDT


>===== Original Message From Joel Pitt <joel.pitt@gmail.com> =====
>Peter de Blanc wrote:
>> IMO, any AGI which is not explicitly Friendly should be considered
>> Unfriendly. Even if you successfully engineer an AGI which decides to
>> leave humans alone, this technology can be turned into dangerous UFAI
>> far more easily than it can be turned into FAI, which needs to be
>> designed Friendly from the beginning.
>
>What is the reasoning behind being able to turn ve into UFAI more easily?
>
>Perhaps ve just decided that the best course for humans is for them to
>learn for themselves, and was going to sit back and observe all of
>existence (assuming it had already worked out that humanity could
>survive on its own).
>
>Joel

Agreed! That's another major principle we should consider: would transcendence
handed to us on a silver platter really be worthwhile? Maybe a mellow FAI is a
design structure more likely to survive its own self-improvement.

    P


