Re: Donaldson, Tegmark and AGI

From: Russell Wallace (russell.wallace@gmail.com)
Date: Fri Aug 11 2006 - 23:33:03 MDT


On 8/12/06, Brian Atkins <brian@posthuman.com> wrote:
>
> Hmm, interesting. It seems you've identified, and are experiencing, a
> relatively newly emerged example that may be one tiny member of a large
> class of psychological factors probably worth being added to Bostrom's
> existential risks work. Has he or anyone done any work on establishing
> whether particular belief systems/religions might actually increase (or
> decrease) existential risks?

Not that the phenomenon is new, of course, but the study of it...

> Probably not a very politically correct or easy subject to delve into,
> yet this is the same exact kind of factor that makes me uneasy having
> highly religious Presidents here in the USA who seem to privately believe
> "A coming apocalypse is an unavoidable part of future reality, and not a
> bad thing because some will go to Heaven."

...that's far enough off topic for this list that we're probably going to
get killthreaded if I answer here, so while I don't have a problem
discussing it, please ask again on another forum such as extropy-chat, or
by private email, if you'd like me to answer.

> I worry that subconscious factors involved with faith-based reality-view
> belief systems could affect important decision making if one were not
> superhumanly vigilant in remaining unbiased. Even if it shifts your
> priorities by a few percentage points, it might be important. Maybe you
> don't put in those extra 15 minutes a day on your work

And thereby have that little margin of time to get sane, get your head out
of the rut (see Brooks's 'The Mythical Man-Month' and McConnell's 'Rapid
Development'), realize you were headed in completely the wrong direction,
pull back, go in the right direction, and accomplish something.

Point: these things don't necessarily go in the direction intuition says
they ought to, even for writing Excel, let alone AGI.

> The key problem I see with your particular belief system is that rather
> than providing you with access to important new realizations through
> acceptance of a high likelihood of local failure, it could instead lead
> you astray by providing in the end what looks like yet another form of,
> as you say, "a need to deny the possibility of failure". Except rather
> than outright denying it up front, you're just moving things around
> slightly and saying failure in this and many other localities is ok as
> long as not everywhere fails - you still apparently have the need in the
> end to deny to yourself that total failure could occur. So are you really
> the ultimate pessimist, or are you slyly giving yourself a crutch?

Neither: I'm _openly_ explaining how I accidentally came by a useful crutch
and what the result was. Consider that (real, not metaphorical) crutches
were invented by some bright spark long enough ago that we don't remember
who it was. Did said inventor go "well, obviously it's a good idea for
people to hobble around awkwardly on their arms so their leg muscles will
atrophy"? Obviously not: he went, "well, there are circumstances in which
people's legs just won't carry them, here's something to help". Do you think
the human mind is just plain capable of being perfectly rational in any and
all circumstances whatsoever, like the way Superman can swim through lava?
If so, please point me to a single example among all Earth's six billion.

The _ultimate_ pessimists are here: http://www.vhemt.org/

I'm a pragmatic pessimist. I'm a man with a job to do.

The possibilities you raise would be issues indeed, if they materialized.
But look around you; use the empirical method. Who in these parts is
advocating collective suicide for Earth-descended life by slowing down the
research that could save us? Not me.


