Anyone wondering why Eliezer abandoned open source strategy?

From: Michael LaTorra (mike99@lascruces.com)
Date: Sat Sep 30 2000 - 14:02:49 MDT


The answer is obvious after one reads the item below that I'm re-posting
from the Extropians list.

The enemies of AI/SI (and even, I daresay, augmented human
intelligence) are willing to do anything to stop us.

I can only compare this scenario to the Biblical myth of King Herod ordering
all male children under 2 years old to be killed because he feared that one
of them was the Messiah who had been prophesied to rule all Israel.

I wonder how many men, women and machines the anti-AI/SI people would be
willing to destroy in order to try to prevent the inevitable?

I can imagine these anti-progressives standing among the radioactive ruins
of the new civilization that they stopped aborning (temporarily, at least).
I can hear them paraphrasing the US Army lieutenant in Vietnam who said "We
had to destroy the village in order to save it."

Regards,

Michael LaTorra
mike99@lascruces.com
mlatorra@excite.com

3229 Risner Street
Las Cruces, NM 88011-4823
USA

505.522.5121

Date: Fri, 29 Sep 2000 08:15:58 -0400
From: "CYMM" <cymm@trinidad.net>
Subject: Re: Why would AI want to be friendly?

EUGENE SAID: "...AI@home type of projects using above technologies
should be outlawed...."

CYMM SAYS: Eugene, it won't work. Betcha writing hyperintelligent AI is more
addictive than heroin. If you can't stop people from writing klutzy computer
viruses, how are you going to stop them from writing AI?

And ...AI doesn't have to be very smart to be ultra-destructive. The human
author just has to be a psychopath, is all.

EUGENE SAID: "...If I knew someone is about to succeed in building a >H AI
or a gray goo autoreplicator before we're ready for it, despite a universal
moratorium I would nuke him without a second thought."

CYMM SAYS: If he were a physical & sovereign state, maybe. But such a
strategy hasn't been deployed against the drug production of Andean South
America... and I daresay it never will. Far less against, say, 20,000 or so
21-year-olds scattered through university dorms and living rooms the world
over.

If it's possible - then humans may have to live with its inevitability.

You may even see a whole cult of mind hackers by 2010. Whole multi-terabyte
hard drives (...or whatever...) with seriously camouflage-encrypted binaries
that are decrypted on the fly. The cult of superhuman intelligence is
enthralling... once it is practically realizable, it will become a
religion.

Do you think you can fight religion? And win?

You're talking about the "Artilect Wars" all over again. But this is a
dirty, Vietnam-style war.... where people defend their God-given "right" to
create God with a seething, righteous passion. Should be fun if you're into
war.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT