From: Eugen Leitl (email@example.com)
Date: Sat Jun 29 2002 - 04:11:09 MDT
On Fri, 28 Jun 2002, James Higgins wrote:
> This appears to state that you believe that kicking off The
> Singularity is intrinsically evil. Is this a correct interpretation
> of your statement?
No. I believe that all flavours of Singularity in which we people don't
fall off the bus are good. However, I have objective reasons to believe
that all brands of Singularity with very hard edges right at the onset are
hostile to organic life in general, not just to people. It is difficult to
assess how hard it is to engineer a seed for a hard-edged early
takeoff, but given the most probable outcome I think it's a Really Dumb
Idea to try.
> Another thing that is starting to appear hopeless is your belief in
> regulation of the Singularity.
I don't enjoy pointing this out, but you'd be surprised how hard it is to
be an outlaw on the run in an extremely transparent, high-surveillance
society. We're increasingly moving towards law & order rather than
Brinworld/cryptoanarchy, and it is obvious that technology gives an edge
to centralistic gooberment agencies rather than to largely clueless,
apathetic agents who don't understand the power of synchronized action. It
is not obvious that the latter is changing; in fact, so far people are
swallowing the line hook and sinker.
I think the majority of people here are extremely overoptimistic about the
threshold of resources required to kick off a successful attempt, and
societies *do* change very rapidly now, and not necessarily towards
greater freedom.
> Discussing regulation of *any* up-and-coming technology is pointless.
Nope. The big potential killer technologies are very few, and there are
distinct routes which can be blocked. Sure, there are going to be
surprises, but if you keep track of what's occurring there will be quite
few of them. If there's military AI research, then paranoid governments
spying on each other is clearly a Good Thing here.
> New technologies move, adapt and change much too quickly for
> legislatures to deal with. Only after a technology has matured (is
Once again, you do not realize what it means to be classified as armed,
dangerous, and on the run. Please do not point me towards the current,
very lenient low-tech societies as a demonstration that individuals and
groups of individuals engaged in low-profile activity can indefinitely
maintain their cover. As AI researchers you surely realize what it means
if cash is outlawed and the entire technosphere spies on you, using data
warehousing methods to look for patterns.
So far only bioterrorism and nuclear terrorism (trivial threats, as far as
Singularity tech is concerned) are on lawmakers' and LEOs' radar. As soon
as AI is classified as a credible threat, the crackdown can be swift and
hard. If the threat is considered very credible, this has the power to
transform societies. (I hope you'll enjoy your anal probe.)
There's a window of free operation before that, but I think a number of AI
people here are being very, very optimistic about the time scale. If you
can pull it off before then, and it doesn't kill us, the point is moot. If
you can't, well, see above.
> substantially deployed) can it be regulated. I have provided
Regulation has so far worked for weapons of mass destruction. Hard takeoff
Singularity seeds can be seen as slow-starting but Armageddon-class
weapons. The threshold to generate a viable seed is high, so arguments
about brilliant teenage hackers operating from their bedrooms will be met
with polite derision.
> examples of my view. I have not seen ANY information in support of
> your beliefs, however. Please provide us with a number of concrete
I don't have to provide jack. I see a number of people talking deep
engineering hubris in the face of the mother of all high-complexity
efforts, with the highest impact imaginable. The usual RISKS thing, with
the stakes being more than a bit higher than usual. As I said, it's
pointless trying to argue with people who're deep in denial. It's
self-selected: if they didn't have the specific agnosia, they wouldn't be
doing this thing in the first place.
> examples where effective regulations to govern a technology were
> passed prior to the release or deployment of the technology. Note
> that technologies created by government agencies would be
> inappropriate and poor examples.
Nope, technology is technology.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT