From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Mar 25 2003 - 22:47:03 MST
Lee Corbin wrote:
>
> Therefore I propose that your (and John Smart's) idea
> cannot really be a solution to the Fermi Paradox. For
> there is nothing stopping advanced processes from
> percolation in space simultaneous with concentration
> in space. By now, we should have seen a large portion
> of the visible universe converted to computronium.
> Since we haven't, we must assume that nobody's out
> there, at least where we can see them in spacetime.
> (We cannot see, of course, the approaching wave front
> of an advancing civilization until it's almost on our
> doorstep---which could happen, though, at any moment.)
Furthermore: Even if the cognitive or technological embodiment of the
superintelligence contains at least one conserved "characteristic",
necessarily spent in any attempt to expand or send out agents, such that
any "effort" in that characteristic directed at collapsarity yields
greater returns than the same effort directed at expanding to harvest
mass, and it is not possible to take both actions simultaneously...
Then, even so, any *Friendly* SI would still send agents outward to rescue
civilizations in distress, unless the moral return on *that* action was
also less than the return on the same effort (with respect to the
conserved characteristic) directed at collapsarity. Which is hard to see,
though I suppose it might be possible.
We can generalize the collapsarity explanation of the Fermi Paradox as
follows: Any SI (within range of us) has at least one characteristic of
conserved effort or focus of attention, such that even the infinitesimal
effort required to spawn physically reproducing computronium-eaters or
civilization-rescuers *always* yields greater computational or moral
benefits when directed in some other, incredibly fruitful direction. Note
that this direction need not be collapsarity.
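Stated as a bare inequality (the notation here is mine and purely
illustrative, not anything formal): let e be any amount of the conserved
effort, R_spawn(e) the computational or moral return on spending e to
spawn probes or rescuers, and R_best(e) the return on the best alternative
use of that same effort. The explanation requires

    for all e > 0:    R_best(e) > R_spawn(e)

to hold for every SI in our past light cone, at every point in its
history, with no exceptions.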
An additional requirement of this explanation is that SIs born out of the
same physical space *never ever* compete, for *anything*. Otherwise it
makes sense to send out external probes just to prevent hostile SIs from
being born.
I am highly skeptical. And I'm even more skeptical of that theory as an
attempt to explain the silence of Friendly SIs. I'm currently leaning
toward the idea that intelligent life really is *just that rare* - nothing
else in our past light cone.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence