From: James Higgins (email@example.com)
Date: Thu Jun 27 2002 - 02:28:18 MDT
At 11:08 PM 6/26/2002 -0600, Ben Goertzel wrote:
> > As military ethics are somewhat codified I believe a military AGI
> > may in fact be a
> > *more* desirable case for AGI development. Again this is said from the
> > viewpoint of one who trusts the US military and who supports our current
> > missions.
>Oh man this is a scary day of e-mails!!!
Yep, certainly is. This also illustrates my points, I believe, although
even I would rather not see them illustrated quite so well in a single
day. Seeing CFAI applied in such a manner really makes me queasy.
>First we have Eli, who thinks that whomever can build an AGI is
>intrinsically possessed with tremendous wisdom...
>And now, this assertion that the US military should be trusted with AGI...
>My Singularity optimism is rapidly getting the jitters! Eugen, maybe you're
>right.... (Just kidding -- partly...)
>I don't doubt that there are many trustworthy, moral individuals within the
>US military. However, I do not trust the US military as an organization,
>no way Jose'. This is the crew that invaded Grenada... the organization
>that nuked Japan (probably both bombs were needless, but it's pretty damn
>clear the second one was), that systematically tortured Vietnamese women and
>children in the name of democracy and justice.... I'll spare you a full
>list of examples. Ever read the first-person account of how US soldiers
>chopped off the arms and legs of a Vietnamese woman, inserted dynamite in
>her vagina and blew her up? Not an outlier occurrence. Excuse me if the
>impeccable morality of the US military seems a little questionable to me...
OK, setting those matters aside entirely: the US military FOLLOWS
ORDERS. Those orders come, at the top, from very few people, some of
whom may not have the morality and discipline you are attributing to the
military in any case. The president, who is elected primarily on the
strength of a popularity campaign, would have direct control over this
SI. That is the most worrisome idea I've heard in a while. There is no
way to guarantee such power would be wielded properly, and we're talking
about something MUCH, MUCH more devastating than the nuclear
arsenal. Not to mention that the SI could be insidious, clandestine,
smooth, or whatever else was necessary. Hell, why not have the SI take
control of the world markets so the US could reap all the money? But
where should that money go? Well,
maybe the president happens to own a few companies that could use some extra cash...
>Yeah, the US military has done some good (I didn't like the Taliban either).
>And some evil. And the people who run it are charged with protecting the
>interests of the USA (the military and ECONOMIC interests, as history amply
>shows), not with promoting the general good of the human race. If a
>Singularity is gonna lead to the greater general good for sentient life, but
>may lead to the dissolution of the US as an entity and the consequent
>meaninglessness of the US military and ranks like "general" and "admiral",
>do you think the Joint Chiefs are gonna go for it??? Hmmmm....
And there we have the biggest problem. The Singularity should be, and needs
to be, for everyone, not just one government. If this were to occur we
could see SI warfare between governments. I shudder at the thought.
> > My opinion is the opposite of yours in this regard. Of course I do not
> > want a radical islamic government creating an AGI whose unity of will is
> > with gun-toting mullahs. Rather I want the US government
> > military/civilian research institutions with whom I am comfortable to
> > direct this effort.
>The Singularity is directly opposed to the national interests of any
>particular national government, because it will almost inevitably lead to a
>"revolution" that will make national boundaries meaningless.
I believe this is inevitable. Well, unless we end up in one of the
nightmare scenarios, which I'd rather not think about today...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT