Re: AGI funding (was Re: Some bad news)

From: Slawomir Paliwoda (
Date: Fri Nov 08 2002 - 23:35:08 MST

> Slawomir Paliwoda wrote:
> >
> > Yes, in order to create a message that would resonate with people, one
> > has to put a good spin on the Singularity. Instead of talking about
> > Jupiter Transition Guides, Singularitarianism, or boring AI technical
> > details, one might just as well say something like "FAI could help to
> > cure cancer and AIDS", "FAI would be helpful in figuring out your
> > material problems", or "FAI could save people from dying". That's
> > basically how nanotechnology gets "sold" to the masses. And since FAI
> > could do that as well........ you get the idea. If anybody is
> > interested enough, the FAI websites would provide further details.
> Dumbing down the Singularity for "the masses"... is that really what an
> FAI would do?

No, but why would you have to do what an FAI would do? This has nothing to
do with programming FAI. This is about getting the funds to start building
FAI in the first place. Or at least I think so.

Putting a spin on the Singularity doesn't mean dumbing it down for the
masses at all. "Spin" in this case means being very careful about how much
shock you apply to your audience. And the shock is applied not to the
masses, but to the few people with the ability to support the project. Any
mass movement is both unrealistic and unnecessary.

> It doesn't seem very compassionate. Or very honest. Or
> very effective.

Scaring people is unnecessary and definitely not more effective. Providing
less information, as long as it is still true, doesn't make your message a
lie.

> I have seen newbie Singularitarians try to spin the
> Singularity for what they fondly imagine to be the lowest common
> denominator, and I have never once seen that trick work. All that happens
> is that the real message - the message that originally produced the
> Singularitarian - is lost out in a morass of unconnected technical details
> and unsupported noncredible assertions.

I'm not sure what you mean by that.

> If you're saying things that you
> genuinely believe, somewhere in there will be things that even other
> people may find to be worth believing. If you're trying to be
> manipulative, saying things you think are real clever and that will really
> pander to those stupid masses, you will make amateurish manipulative
> statements that will instantly be filtered out by an audience trained to
> resist the highly experienced manipulation of the trained professionals.

Well, religion seems to be pretty successful at implementing those tactics.
Actually, people will believe almost anything. But why would you have to
manipulate people to get what you want in the first place? The right
message, as long as it is true, will do. This is not religion or politics,
where you need to know how to manipulate people. The masses are not
necessary either.

> Is the solution to get more practice at manipulating people? No. That
> high a competence at manipulation takes too much evil. There's no way
> those skills could be employed at that level while keeping the essential
> message of the Singularity intact. Remember that we are not here to make
> ourselves famous. We are here for the Singularity. If you make yourself
> famous and lose the Singularity that is anti-progress.

I'm not sure that the Singularity contains any message, or that this
message, if it exists, needs to remain intact for some reason. It's merely
a word describing a series of events driven by the accelerating progress of
technology. Like any other event, it carries no message. The Singularity
itself is not the goal; the well-being of humanity faced with an inevitable
Singularity is. Humanity doesn't need the Singularity, or superficial
theories designed around it. Again, this is not religion or politics. All
efforts to protect that theory are just a waste of time that could be spent
trying to accomplish something of real value to humanity. Those efforts
would be the real anti-progress.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT