Re: Threats to the Singularity.

From: Samantha Atkins (samantha@objectent.com)
Date: Thu Jun 13 2002 - 18:26:26 MDT


Mike & Donna Deering wrote:

> In my personal opinion any kind of Singularity is a good Singularity,
> whether humanity is around to see it or not. And I also think that
> barring the total collapse of technological civilization the Singularity
> is inevitable.
>

Well, in my personal opinion and speaking plainly, disowning
one's species, literally all of humanity including oneself and
one's loved ones, in favor of an unknown set of hopefully stable
higher intelligences is a very deep form of treason and betrays
a singular contempt for humanity that I find utterly appalling.

Why should human beings care for or participate in setting up a
Singularity if it has a large likelihood of being the end of
them all, instead of rosier alternatives like uploading them?
Why shouldn't they fight the development and deployment of such
a thing for their very lives?

 

>
> Biological weapons. This is it. This is the one that is going to cook
> our goose if we don't take drastic action now. There are at least 25
> countries developing bio weapons. Many bio agents have already been
> created that could destroy human civilization and more are being
> designed all over the world even as we speak. Someone somewhere is
> going to make a mistake and one of these is going to escape from a
> government lab someday soon. The technology for recombinant genetic
> engineering is rapidly climbing down the scale of general availability.
> If the government labs don't get us the garage experimenters or the
> terrorists will. There is also the possible though less likely
> accidental creation of a killer virus in one of the many university, or
> business labs. The perfect bio weapon would be a virus of maximum
> contagion such as flu, combined with maximum lethality such as Marburg,
> clandestinely disseminated in a location of maximum dispersion such as
> an airport.
>

It is actually not a simple matter to design a bug that has the
right level of lethality at the right pace and to disperse it
widely enough. If it is too virulent and lethal, then it burns
itself out before it can spread. If it is too slow, then
counter-measures are more likely to be successful. It is far
from as simple as opening a jar of nasties in the middle of an
airport.

Nevertheless, the danger is real enough.

There is also a massive danger of governments and multinationals
going totalitarian and rigorously suppressing all research and
development that does not bolster their power and their moneyed
interests.

 
>
>
> What to do? As much as I hate the idea, the best option I see is to
> outlaw privacy. But this does not seem to be practical in the present
> political environment. Alternatively we could treat bio viruses like

This is totally the wrong answer. A fully transparent society
only works in an environment of sane laws and sane makers and
enforcers of laws. We have neither. Full transparency (both
government/industry and individuals having no privacy) would be
more helpful than the one-way variety that is far more likely,
and that is being speedily implemented with or without official
policy.

In a society where the vast majority are relatively irrational,
unimaginative and lacking any great intelligence, full
transparency would mean that that majority (or those most adept
at its manipulation) would be fully able to control the
creative, highly intelligent minority from whom most of the
progress toward Singularity will come.

> computer viruses, with firewalls and antiviral software. Firewalls
> would consist of each home being equivalent to a level 3 bio containment
> facility with HEPA filters and decontamination air locks. When you went
> outside you would wear a Racal hood with a virus filter and exhaust fan
> along with a virus impervious body suit. The anti-virus software would
> be by subscription to a daily antibody update. This would require the
> development of systems that could make a vaccine for a new virus in 24
> hours or less. We are not quite there yet.
>

How about we just grow a lot more sane human beings instead of
digging continuously for technological fixes that really aren't
fixes to the all-too-frequent cussedness of local sentients?
Replacing them with something that is faster and arguably
smarter, but may or may not be any wiser, is not an answer.
Scrapping sentients is to be frowned upon even if you think you
can, and even do, create sentients that are arguably better
along some parameters.

- samantha



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT