From: David Picon Alvarez (email@example.com)
Date: Fri Dec 09 2005 - 14:23:23 MST
From: "Phillip Huggan" <firstname.lastname@example.org>
> David, you are smart enough to handle MNT in a way that doesn't create
> a new extinction risk. Don't bomb cities with it and you are set. It is
> the other people with MNT that create the risk. The risks are not that
> different from well-known military ambitions. An MNTed missile defense and
> MNTed missiles delivering nukes/neutron bombs/EMPs, that is the risk posed
> by irresponsible MNT administrators. The solution to UFAI and MNT risks is
> identical: don't allow hostile forms of these development programs to occur
> (I don't think defenses to an AGI attack are possible until we have a much
> greater appreciation of extreme physics). Only the time-scales and odds for
> success differ in whether the solution is carried out by FAI, an Oracle, or
> plain old people with MNT.
Thanks for your confidence :-) but even if I were smart enough not to use
MNT as a weapon, I'm certainly not smart enough to prevent other people from
doing so. More to the point, I'm not sure I'm smart enough to know which uses
of MNT are sensible and sufficiently unlikely to create existential risks.
I see why you'd be averse to an AI telling you what to do in any way, but I
think this will be an important part of an SAI: telling us what we should do
to get what we really want, versus what we merely think we want, and so on.
This archive was generated by hypermail 2.1.5 : Sat May 18 2013 - 04:00:48 MDT