Re: Fermi and LOGI

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 24 2002 - 14:28:17 MDT


Eugen Leitl wrote:
>
> If we ever find evidence of nonexpansive spacefaring others, we'll know
> that Darwin-driven expansion is avoidable (the only mechanism I can think
> of is singleton control, and singleton scenarios have very large
> problems). In the absence of hard data, we must assume expansion.

Also, please note that FAI-based singletons would expand to increase the
living space of their inhabitants, or (if this is technologically
unnecessary) send out scouts to see whether there are other intelligent
citizens who need help. The Culture has Orbitals, but it still has a Contact
section. Nonexpansion requires a singleton, but a singleton does not imply
nonexpansion.

> Yes, I thought that's what I said, with the exception that exploration and
> expansion are the same thing, and that the metabolism of an advanced culture
> has signatures detectable over megalight-years (large patches of colonized
> substrate would be visible over gigalight-years, but you can't observe these
> because the deep field is equivalent to the young universe). Stealthy
> exploration = being nice.

Our *current extrapolation* of a civilization's metabolism involves
signatures that are detectable across interstellar distances; it could be
that baryons are too massive and slow to shuffle around or even use as fuel,
and that real computronium doesn't live in any real estate we know about.
One might therefore suppose that a nonfriendly singleton (whether upload or
AI) would be neither detectable nor expanding. However, if a nonfriendly
singleton doesn't
need quaint old-fashioned "matter" and consequently vanishes into Elysium,
then presumably it wouldn't need to eat its originating civilization
either. In this case the originating civilization would go on spawning
superintelligences until it spawned one whose motivations led it to send out
scouts, whether to aid other intelligent life or destroy it. If no possible
superintelligence has such motivations, then eventually biological life from
that civilization would expand across the universe, irrespective of any
created and vanished superintelligences. So you still end up with a Fermi
Paradox.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


