Re: Existential Risk and Fermi's Paradox

From: Anne Corwin
Date: Thu Jun 21 2007 - 11:42:24 MDT

Toby said:

"My point is these are all arbitrary, all of our most fundamental drives are just almost vestigial relics from our evolutionary upbringing."

Why does it matter where these "fundamental drives" come from? Would it somehow be better or less arbitrary if they were imposed upon us by some sort of external supermind? Does anything exist that *cannot* be somehow classified as arbitrary?

I'm not saying that examining and understanding the origins and development of various drives is uninteresting or pointless, but rather that "arbitrary" is something of a red-herring designation. If something turns out to be useful and/or interesting, why should it matter whether it is "merely vestigial" or not?

"Of course this AI could choose to be enthralled by anything, but why would it, if it knew it was all built on sand?"

For the same reason that clinically depressed people might take antidepressants if they start feeling a lack of interest in everyday or previously-pleasurable activities. Existence is full of stuff, and while you may wish to term this observation "arbitrary" as well, I think that conscious awareness and the ability to process information about all this "stuff" is, at the very least, a phenomenon of tremendous potential.

To suggest that a "chosen" sense of enthrallment is somehow "built on sand" is nonsensical, because it makes the (fallacious, to my mind) assumption that nothing is actually interesting -- that the universe is fundamentally boring if you are sufficiently smart or well-informed. Perhaps a property of advanced intelligence *is* the ability to recognize things that are complex and closer to being "objectively interesting," and to tune one's motivational circuits to respond to these things.

Note also, though, that I am not one who believes that personal survival (and therefore the "need" to come up with motivations to continue) must necessarily be incorporated as a "meta-goal" in order for an entity or AI not to destroy itself; if the AI lacks the desire NOT to exist, there is no reason to assume it would choose oblivion.

- Anne

"Like and equal are not the same thing at all!"
- Meg Murry, "A Wrinkle In Time"

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT