From: pdugan (pdugan@vt.edu)
Date: Fri Jul 15 2005 - 09:25:27 MDT
An explicit association of self to universe would probably be useless, not to
mention anthropomorphic. However, were the AI's supergoal to assure optimal
growth, freedom, happiness (or whatever) for a set of entities to which an
identification could be assumed as a functional metaphor, and were that set
to continue growing recursively as the AI's knowledge grew, then we'd have an
ambassador for whatever Matrioshka brains are hanging out in Andromeda, or
whatever machine elves exist in other universes, not to mention a fairly
survivable Friendliness dynamic for the rest of us. Of course, a mindless AI
designed under a sort of universal identity principle might just assimilate
everything it finds; I think of universal identity as more of a design
heuristic than a cognitive hard-code.
>===== Original Message From Peter de Blanc <peter.deblanc@verizon.net> =====
>If an AGI has a concept called 'self' which means 'the universe,' why
>would this affect its goal system at all?