Re: Donaldson, Tegmark and AGI

From: Brian Atkins
Date: Fri Aug 11 2006 - 23:00:52 MDT

"So finally I was able to understand that if I believe this is _true_ - which
I do - then I can accept a 99% probability that we will fail..."

Hmm, interesting. It seems you've identified, and are experiencing, a newly
emerged example of what may be a large class of psychological factors worth
adding to Bostrom's work on existential risks. Has he or anyone else done any
work on establishing whether particular belief systems/religions might actually
increase (or decrease) existential risks?

Probably not a very politically correct or easy subject to delve into, yet this
is the exact same kind of factor that makes me uneasy about having highly
religious Presidents here in the USA who seem to privately believe "A coming
apocalypse is an unavoidable part of future reality, and not a bad thing
because some will go to Heaven." I worry that subconscious factors involved
with faith-based belief systems could affect important decision-making unless
one were superhumanly vigilant in remaining unbiased. Even if it shifts your
priorities by a few percentage points, it might be important. Maybe you don't
put in those extra 15 minutes a day on your work, or you take long vacations at
your Texas ranch instead of learning more, and you miss something important, or
your world-improving AGI is less optimized and a year later than it could have
been so another 200k people die. Etc.

The key problem I see with your particular belief system is that rather than
providing you with access to important new realizations through acceptance of a
high likelihood of local failure, it could instead lead you astray by providing,
in the end, what looks like yet another form of, as you say, "a need to deny
the possibility of failure." Except rather than outright denying it up front,
you're just moving things around slightly and saying failure in this and many
other localities is ok as long as not everywhere fails - you still apparently
have the need, in the end, to deny to yourself that total failure could occur.
So are you really the ultimate pessimist, or are you slyly giving yourself a
crutch? If your currently unprovable belief system turns out to be wrong, and
if it has had any negative influence on your cumulative decision-making up
until that point, then that is a potentially large oops.

Brian Atkins
Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT