From: Rolf Nelson (firstname.lastname@example.org)
Date: Sat Jan 26 2008 - 12:29:44 MST
> Most people should realize that they don't have the skills to improve on
> past AGI efforts, and should devote their resources to existential risks
> which are easier to understand, such as killer asteroids.
Peter, overconfidence is indeed an ongoing risk with this venture (as,
indeed, it is with any venture, especially one attempting to build a new
technology). In general, all else being equal, simple solutions should be
preferred to complex ones.
However, the ratio between AGI existential risk and killer-asteroid risk in
this century has got to be on the order of a million to one!* Despite this,
I would estimate that asteroid-impact mitigation overall commands more
resources than FAI does.** I don't know how large a Bayesian shift you
propose to correct for overconfidence, but surely it's not a shift of that
magnitude.
Perhaps my own conclusions differ from yours as follows: first, I have
confidence in the abilities of the current FAI community; and second, if I
didn't have that confidence, I would try to bring about the creation of a
new community, or improvements to the existing one, rather than abandon FAI
for SpaceGuard-type programs.
* The asteroid risk may go up by some unknown-to-me amount if you consider
knock-on effects from an otherwise non-existential impact, but I doubt this
consideration changes my overall analysis.
** This is a vague statement; I'd be interested if anyone has figures on the
total resources available to SpaceGuard-type programs so that I could pin
this down better.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:01 MDT