RE: Universe identity (was: Fighting UFAI)

From: pdugan
Date: Thu Jul 14 2005 - 21:08:52 MDT

This echoes an idea I've had regarding inclusive identity. The best way to
make an AI selfless would be to wire in a self-concept which recursively
includes every known object as part of the overall self. If this process
extended continuously to objects outside the included set, then the
included set would keep growing, allowing the AI to identify not just
explicitly with the observable universe, but with the perceived limits of
observable reality.

  Patrick Dugan

>===== Original Message From Joel Pitt <> =====
>Eliezer S. Yudkowsky wrote:
>> See this post: regarding the
>> observed evolutionary imperative toward the development of monoliths and
>> the prevention of, e.g., meiotic competition between genes. Replicating
>> hypercycles were assimilated into cells, cells assimilated into
>> organisms, etc. In Earth's evolutionary history there was a tremendous
>> advantage associated with suppressing internal competition in order to
>> externally compete more effectively; under your postulates I would
>> expect high-fidelity goal-system expanding monoliths to eat any
>> individual replicators, much as fish eat algae.
>Instead of considering the AI as an expanding monolith in itself,
>convince it (hardwire it) to think that it *is* the universe. It would
>then be interested in suppressing internal competition, assuming it is
>designed so that anything destructive that occurs in the universe -
>including the increase of entropy - causes it pain or discomfort. As
>with any simple idea like this there are complications, such as it
>preventing us from self-improving ourselves and using more energy -
>since it might perceive that as a form of cancer - but such things are
>likely to cause only minimal discomfort unless a particularly selfish
>transhuman decided to expand their power sphere a lot more than anyone else.
>This all leads to Bad Things(TM) if we start to consider multiple
>universes competing via natural selection, though - since an input of
>energy would be needed to prevent the entropy increase of one universe,
>and presumably the AI would have a persistent itch to do something
>about it if possible.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT