Threats (was Theoretical question for the list: publicity?)

From: Arctic Fox (arctic.fox@ukgateway.net)
Date: Sun Apr 08 2001 - 08:35:07 MDT


At 14:47 08/04/01, Dale Johnstone wrote:
>Brian Atkins wrote:
> >To the list I ask: if for instance Wired magazine wanted to do a large
> >article about SIAI in the near future, complete with cover image of
> >Eliezer with a quote "This 21 year old cognitive scientist is building an
> >AI that will end the world as we know it" do you think that would accelerate
> >our plans or hurt them? Assume that besides talking about Eliezer and our
> >plans, it also presents FAI as the answer to Bill Joy's AI concerns.
>
>As much as I'm in favour of openness, you're not going to get an informed
>'debate' in the media. Frankly, you're not going to get any kind of debate
>much beyond lists like this one. I would love to live in a
>society in which people could raise issues of the day, and everyone would
>have their say, and we'd come to some sort of sensible conclusion and do
>the right thing - but sadly that's not how it happens. An article like
>that (if taken seriously) would lead to a polarization of opinion, most of
>which would be against us, and possible exposure to extremist groups. In
>general, AI is not perceived as harmful or in any way threatening, and I'd
>like it to stay that way.

Perhaps Brian and Eliezer could give us some background on this. Have
there been any threats or communications from extremist or religious
groups, etc., regarding the Singularity Institute so far? Or is SL4 one
level too high for them to comprehend and take seriously?

If the research (i.e., the AI coding) had to go underground, how
difficult would that be? I presume there wouldn't be much physical
equipment involved (unlike, say, a cloning laboratory), so would it be
possible to relocate, and for the group to split up and share data
securely over the internet?
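
On that last point, the "share data securely" part is technically the
easy bit: standard symmetric encryption over any untrusted channel would
do. A minimal sketch (in Python, assuming the third-party 'cryptography'
package; the key handling and the sample data are just placeholders):

    from cryptography.fernet import Fernet

    # Generate a shared key once and distribute it out of band
    # (e.g. in person). Anyone holding the key can decrypt.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Encrypt a payload before sending it over an untrusted channel.
    data = b"snapshot of the source tree"  # placeholder for real files
    ciphertext = cipher.encrypt(data)

    # A collaborator holding the same key reverses the process.
    assert Fernet(key).decrypt(ciphertext) == data

The hard problems would be operational (distributing keys, trusting the
members), not cryptographic.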

Paul


