Re[2]: continuity of self [ was META: Sept. 11 and Singularity]

From: Cliff Stabbert
Date: Sun Sep 15 2002 - 17:08:49 MDT

Sunday, September 15, 2002, 11:01:22 AM, Ben Goertzel wrote:

SA> What if you could get enough people en masse to step beyond
SA> scarcity thinking and begin thinking in terms of abundance and
SA> maximal development and well-being for everyone? It would be
SA> extremely and wonderfully powerful. It could turn the tide of a
SA> lot of dystopias and impending disasters.

BG> It would be great, but, that seems to me to be an order of
BG> magnitude harder problem than creating real AI !!!!!!
BG> I'm afraid that humans aren't wired for the kind of "advanced
BG> consciousness" you're advocating, so that getting more than a few
BG> statistical outliers in neural space to think this way, is a hell
BG> of an uphill climb.

I'm not so sure it's all that difficult to convince people,
*given the right tools*. IMO most reasonably intelligent folks can at
least be partially persuaded by the arguments and inventions of
Buckminster Fuller (such as the World Game), for example. Robert
Anton Wilson's speculations on "The RICH Economy" in his novel _The
Schrödinger's Cat Trilogy_ are pretty persuasive (if simplistic) as
well, as far as I'm concerned. (You can search for "RICH Economy" to
get to the relevant bits in the unauthorized online text of the novel
at )

I don't see the problem being, in the end, convincing most people.
*That* part we can do an end-run around: if you give them abundance,
the need to convince them of its possibility evaporates.

What I see as the largest problem is the people and institutions who
have the most power in this world: those who run things (aka "The Man"
or "Them"). *Their* scarcity-based, power-over-others thinking is
IMO the gravest danger to achieving even reasonably equitable
results from nanotech, AI and similar developments. The belief that
in order to enjoy their current 'high' positions, others (other
countries, other people) must be kept 'low' has been ingrained over
millennia in the aristocracy -- to the point where some institutions
and individuals have become, for lack of a better word, "addicted" to
power as an end in itself.

How big would the temptation be for any current superpower to grab the
first workable nanotech or the first usable general AI and use it to
wield power over others? The type of people who freak out over the
idea that the masses can protect their privacy with 128-bit encryption
are *not* the folks I'd trust to use nanotech or AI to "advantage all
without disadvantaging any" (in Buckminster Fuller's terminology),
even though such technologies clearly can do so. I also wouldn't
trust them to *not* swoop down out of the sky and take over your
lab/company/garage when you've made enough progress to be useful to
them, but well before you achieve the Singularity.

So, scarcity thinking at the top is my concern. I don't see any easy
way to change that thinking, so it's something that will have to be
gotten around. Call me paranoid (you're ONE OF THEM, AREN'T YOU), but
I'd say either be prepared to take your research underground once it
starts really getting somewhere, and/or make sure it's distributed
enough (e.g., through Open Source, publishing papers, etc.) that it
cannot be easily contained.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT