Re: Another Take on the Fermi Paradox

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Dec 24 2002 - 18:40:18 MST


Brian Atkins wrote:
>
> Ok, but I find this argument unconvincing since a) building and
> launching the initial VNP should be relatively easy for such post-S
> minds, and b) we both expect (AFAIK) there to be a wide variety of
> post-S minds, such that it is very unlikely that not a single one of
> them in any post-S civ decides to launch a VNP. One possibility I
> suppose is that the automatic outcome of any Singularity is _always_
> that a single mind or "ruleset" takes control of that local space and
> _always_ decides to disallow VNPs or any other form of "civ making
> itself known to the rest of the galaxy". It does seem unlikely though
> that every single successful Singularity process has the same exact
> outputs.

Plus, however unlikely it may be that a civilization like *ours*
permanently avoids a Singularity, I can imagine an insect-derived
hivemind managing to do so in, say, one out of a thousand cases, or
perhaps one out of a million. In that case one must still explain why
these proportionally rare yet real human-level civilizations do not
physically expand and take over the observable universe, if all
Singularities vanish into more convenient spaces.
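
As a back-of-envelope illustration of that expected-value point, here
is a minimal Python sketch. The civilization count N_CIVS is a purely
hypothetical number chosen for illustration and is not from the post;
P_AVOID reuses the "one out of a million" figure above.

    # Illustrative expected-value sketch (hypothetical inputs):
    # even a tiny per-civilization chance of permanently avoiding a
    # Singularity leaves many non-Singularity civilizations if
    # civilizations themselves are common.

    N_CIVS = 10**9    # hypothetical number of civilizations in our causal region
    P_AVOID = 1e-6    # the "one out of a million" chance of avoiding a Singularity

    expected_holdouts = N_CIVS * P_AVOID
    print(f"Expected human-level holdout civilizations: {expected_holdouts:.0f}")
    # With these made-up inputs, roughly 1000 civilizations never reach a
    # Singularity, and their failure to expand visibly is what needs explaining.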

The most plausible explanation I've heard so far is that one of the steps
in the evolution of complex biology and intelligence is enormously more
improbable than it seems, and that we are alone in this causal region.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

