From: Gissur Þórhallsson (email@example.com)
Date: Fri Nov 02 2007 - 17:11:32 MDT
> you only need a 1 in 10^20 chance that a planet would evolve intelligent
> life to explain our presence
But that is exactly the point: we're not debating our own presence, we're
debating why we can't see evidence of other presences, why we can't see any
advanced stellar technologies - and one idea is that, even though life may
be highly common, and intelligence may be fairly common (substitute
your own multiplier at will) - it seems to be the cold hard fact at this
stage that galaxy-spanning civilizations are NOT common, if they exist at all.
So somewhere, assuming a high probability of intelligence evolving,
intelligence falters - and hence we have The Great Silence. But what I was
getting at is that The Great Silence might be explained by the fact that an
AGI wouldn't NEED to colonize the cosmos:
0) It's not driven by the same impulse to reproduce as we are.
1) It can observe quite a lot from here.
2) All the laws of physics are the same as anywhere else, so it can figure
them out just fine without ever having to leave its armchair (so to speak).
3) Once it has all them pesky laws figured out, it can go ahead and simulate
the universe, thus making traveling pretty pointless.
4) If, having garnered all it can from staying put, it were inclined to
travel - it could do so much more inconspicuously than a stellar explosion
of biomass, which Hanson's model seems to imply.
So we might have a framework in which the Singularity gives us a loophole
through Fermi's Paradox.
As for the rate of intelligence, I'm gonna go out on a limb here and say
that I think there is a lower limit of scale somewhere, where an
intelligence becomes too slow to respond to its environment in any
meaningful way, rendering an attribution of intelligence somewhat suspect.
But then again, if something were to communicate at a bit per millennium,
communicating with it would not so much depend on the speed of thought as
on the attention span of the faster intelligence. I feel I'm stepping on
some thin ice here, so I'm gonna leave it at that.
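To put that bit-per-millennium rate in perspective, a trivial calculation (the 5-character message is a hypothetical example, not anything from the thread):

```python
# Time to transmit a short ASCII message at one bit per millennium.
MILLENNIUM_YEARS = 1000
message = "Hello"                          # hypothetical 5-character message
bits = len(message.encode("ascii")) * 8    # 5 chars * 8 bits = 40 bits
years_to_send = bits * MILLENNIUM_YEARS    # one bit takes one millennium
print(years_to_send)  # -> 40000
```

Forty thousand years to say hello is longer than recorded human history, so mutual invisibility between fast and slow thinkers seems quite plausible.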
On 11/2/07, Matt Mahoney <firstname.lastname@example.org> wrote:
> For intelligence that thinks on our time scale or faster, the speed of light
> is a barrier to intergalactic travel. There may be intelligences that think
> on much slower time scales, but then communication would be a barrier. They
> may see us, but we can't see them because they communicate at the rate of
> one bit per millennium.
> There could also be many great barriers. We don't know. Invoking the
> anthropic principle, the universe is big enough so that you only need a 1 in
> 10^20 chance that a planet would evolve intelligent life to explain our
> presence.
> --- Gissur Þórhallsson <email@example.com> wrote:
> > Hi sl4.
> > (Short self-deprecating 1st post intro: My name is Gissur and I'm
> > from Iceland, I've been lurking for a while, but I thought I'd finally
> > contribute (even if it's just a glorified link).)
> > Well - seeing as how point number 5 is beyond the singularity - we're
> > free to speculate.
> > Robin Hanson tackles this point from a slightly different perspective in his
> > paper, The Great Filter - Are We Almost Past
> > It? <http://hanson.gmu.edu/greatfilter.html>.
> > He argues that the evolution of mankind consists of numerous discrete steps,
> > which he describes as a "best-guess evolutionary path to an explosion which
> > leads to visible colonization of most of the visible universe". He then goes
> > on to discuss that each of these steps could constitute The Great Filter, an
> > obstacle so great that evolution has yet to tackle it, thus explaining The
> > Great Silence.
> > This of course raises the Very Important Question: Are we past the Great
> > Filter? Because if we aren't, we'd better look out.
> > This also raises the question whether an AGI would be as interested in
> > extraterrestrials as we humans seem to be, because if we CAN find reasons
> > for it NOT to be, we might have found a plausible way to sidestep the
> > Great Filter/Great Silence issue - namely, that once an AGI reaches a
> > certain level of processing power (however this power comes to be) it
> > doesn't really need to go anywhere.
> > Assuming an AGI is primarily an infovore, I'd think it would be
> > reasonable to say that it would want to know everything it can know without
> > risking interstellar exploration, and given how we like to think the laws of
> > physics work the same way everywhere in the universe, it could probably
> > figure a lot out just by staying. Maybe it'd need to build a Dyson sphere,
> > maybe not. Maybe it would find some source of energy unknown to us. Anyway,
> > what I'm basically saying is that the constant of colonization, which we
> > ascribe to all life, need not apply to an AGI, since a lot of the principles
> > that drive OUR exploration don't necessarily apply (namely competition and
> > scarcity of resources).
> > I'm sorry if this has all been covered before, and I also apologize if I'm
> > raving.
> > Gissur
> > On 11/2/07, Matt Mahoney <firstname.lastname@example.org> wrote:
> > >
> > > 1. Self replicating RNA, about 3 billion years ago. Single strand RNA can
> > > fold itself into complex shapes. Sexual reproduction might have begun in
> > > the form of combining pieces of molecules to make new ones.
> > >
> > > 2. DNA based life, separating data from function (protein). Major
> > > innovations include error correction (the double strand provides
> > > redundancy), mRNA amplification of protein synthesis, and gene regulation
> > > leading to multicellular organisms with adaptive subsystems such as the
> > > immune and nervous systems about a billion years ago.
> > >
> > > 3. Language, maybe 50,000 to 10,000 years ago, leading to memetic
> > > evolution
> > > (cultural rules favoring reproduction), collective intelligence and
> > > accumulation of written knowledge and technology.
> > >
> > > 4. Self replicating intelligent machines, maybe later this century.
> > >
> > > 5. What? I have no idea, but history suggests it will happen very
> > > quickly.
> > >
> > >
> > > -- Matt Mahoney, email@example.com
> > >
> > --
> > Gissur
> -- Matt Mahoney, firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT