Re: Military in or out?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Feb 25 2001 - 20:08:47 MST


Brian Phillips wrote:
>
> I can however assure you that UNLESS
> your commentary on the difficulties inherent in constructing near-human
> and transhuman AI is utterly spurious,
> the Military/Industrial complex will NEVER cook up enough novelty to
> create superintelligent AI. They might "buy" it but would never be able
> to "build" it!

A pleasant-sounding theory which is blatantly untrue. The NSA in
particular has some of the finest research minds in the known universe. I
refuse to rely on their incompetence. Besides, we're still screwed if
they decide to take over or shut down existing AI projects.

(There are people out there who *still* believe in the Military-Industrial
Complex? I mean, I accept that pre-WWI Germany had a powerful and amoral
MIC, but nowadays that's just a Sixties concept, like the Establishment.)

> But the theory here is that you have to be absolutely sure the sword will
> not turn on you before you let it loose. Military folks are paid to be
> paranoid. If a superintelligent AI could not be utterly guaranteed to be
> loyal it would not be funded.

That is the disaster scenario for military development - research minds
smart enough to develop AI and superior officers dumb enough to screw up
Friendliness.

> ----- Original Message -----

Brian Phillips, please don't quote the entire message in replies.
You Have Been Warned. Also, if possible, please fix your line breaks.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
