From: Jeff Herrlich (jeff_herrlich@yahoo.com)
Date: Mon Apr 16 2007 - 12:21:32 MDT
Hi Kevin,
The odds may be so heavily stacked against us that the probability of success is only 0.0000000000001% for any given civilization (or worse). That doesn't mean we can't be that one civilization, and it doesn't mean we shouldn't try. What if the goal is possible (albeit very remotely so), but every civilization decides to give up prematurely? That would ultimately make this an entirely pointless Universe.
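To put a number on that point: even with a per-civilization success probability that small, the chance that at least one of many civilizations makes it stays above zero as long as someone keeps trying, and it collapses to exactly zero if everyone gives up. A minimal sketch in Python, where the civilization count is my own illustrative assumption, not a figure from the post:

import math

# Per-civilization success probability quoted above (0.0000000000001% = 1e-15),
# and a purely illustrative number of civilizations that keep trying.
p = 1e-15
N = 10**9          # assumed for illustration only

# P(at least one success) = 1 - (1 - p)^N, computed stably for tiny p.
p_any = -math.expm1(N * math.log1p(-p))
print(f"P(at least one success, N civilizations trying): {p_any:.3e}")  # roughly 1e-6

# If every civilization gives up prematurely, nobody attempts it at all,
# so the probability of any success is exactly zero.
print("P(at least one success, everyone gives up): 0.0")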
I'm starting to believe, more and more, that a very large "fraction" of the paradox is simply that an evolved intelligence like us is extremely rare in this Universe.
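That reading is essentially a claim about one factor in a Drake-style estimate: if the term for intelligence evolving is small enough, the expected number of other detectable civilizations falls below one, and a silent sky is what you would expect. A rough sketch along those lines; every input value here is my own illustrative guess, not anything from the post:

# Drake-style back-of-the-envelope: expected number of detectable civilizations.
# Every input is an illustrative guess, not a measured value.
R_star = 2.0      # star formation rate in the galaxy (stars per year)
f_p    = 0.9      # fraction of stars with planets
n_e    = 0.5      # habitable planets per system with planets
f_l    = 0.1      # fraction of habitable planets where life arises
f_i    = 1e-6     # fraction where intelligence evolves -- the "rarity" term
f_c    = 0.5      # fraction of intelligent species that become detectable
L      = 10_000   # years a civilization stays detectable

N_detectable = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Expected detectable civilizations: {N_detectable:.5f}")
# With f_i this small the expectation is well below one, so a silent,
# seemingly dead galaxy is the likely observation rather than a puzzle.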
Best Wishes,
Jeffrey Herrlich
"kevin.osborne" <kevin.osborne@gmail.com> wrote:
recipe for: a little future shock moment.
proposition: take the Anissimov/Yudkowsky view on the seriousness of
Friendly A.I. and other existential risks as a given.
empirical observation: as per Fermi. the aether is silent and
lifeless. all other intelligent species on all other near-space
life-supporting worlds have failed to reach uplift.
theory: the galaxy is dead and void. existential risk has proven
lethal and/or progress-suppressive in all prior cases.
prediction: our chances of reaching/surpassing/outliving the
Singularity are negligible -> nil.