From: Christian L. (firstname.lastname@example.org)
Date: Fri Apr 06 2001 - 16:47:03 MDT
Arona Ndiaye wrote:
> >A 'Singularity for Dummies' sounds like a joke to me. Do not get me
> >wrong, but with all due respect: why should dummies need
> >to understand the Singularity ? More importantly with threats such as
> >Military-grade Nanotech etc... why should Eliezer (and the SingInst)
> >spend time on ANYTHING but what they're already busy with now ?
Chris Cooper wrote:
>This attitude is EXACTLY why it is important to explain these ideas to
>laypeople as we approach the Singularity.
I strongly disagree. The majority of Europeans are afraid of genetically
modified corn. You often hear statements like: "I don't want any genes in my
tomatoes!" and other idiotic claims. The anti-technology movement is on the
rise, and even several scientists say that human cloning is evil.
All in all: most people are afraid of strange new technologies that threaten
old world views.
What we are planning on this list is to create a machine that will literally
take over the world.
Do you really expect the general population to like this idea? I think not.
We are talking about The End of the World As We Know It.
I expect 99+% of the population to be opposed.
In this context, "explaining these ideas to laypeople" is not very desirable
IMO. The less the general public knows, the more likely we are to reach the
Singularity. We don't need a huge number of followers. We only need a
few brilliant programmers.
The best course of action would be not to talk openly at all. We set up
an AI company (without "Singularity" in the name) like Eli outlined in PtS,
and funnel "R&D" money into seed AI research. The only people who need to
know are the wealthy investors (maybe not even them, if the company is…)
However, with "media-horny" people like Moravec, de Garis and Kurzweil it
may be hard to keep the proverbial lid on. And, since SIAI has a public
website, I guess that the SIAI people don't plan on going underground.
The reason why we are not under attack by angry mobs at the moment is that:
1) Most people (the Media) don't know about it.
2) Those who do think we are crazy.
If the media starts to take the issue seriously, I think we are going to be
in big trouble.
Hopefully, a big debate about genetic modification of humans will keep the
anti-tech movement off our backs. AI taking over the world might be too far
out for most people to consider.
>The kind of arrogant attitude that
>says, "Why should we be obligated to explain our important work to others,
>they won't understand it in the first place?", is elitist,
The question should be: "Why should we be obligated to explain our important
work to others, when that would likely make them want to kill us (and end
our work in the process)?"
You are free to call me an elitist, since I feel that this decision is
better made by the elite (us) than by the uninformed masses.
>and goes against the
>very concept of Friendliness that is VITALLY IMPORTANT to the successful
>completion of this little endeavor.
Pragmatically speaking, it is only the AI that needs to be Friendly, not us.
However, not informing the public can be seen as an act of Friendliness,
since informing them might lead to no Singularity at all, which is the
absence of Friendliness.
>Making the general public aware of the very
>scary things that are possible as technology advances so quickly, (such as
>consequences of out-of-control nanotech) will only help us to reach the
What do you base this on? I feel that it would have the exact opposite
effect, as I have explained above.
>is supposed to help EVERYONE, not just those who got in line first.
Sure thing; they just don't have to know about it in advance.
>attitudes help no one in this instance, and ultimately may harm everyone.
As I said above, it could help us, and therefore everyone.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT