Re: [SL4] AI --> Jobless Economy

From: Charles Hixson
Date: Sun Mar 07 2004 - 14:43:38 MST

Yan King Yin wrote:

>At some point in progressive automation, the very
>meaning of "work" may start to break down. This
>is a very serious problem because human beings

I think the tense is wrong. It should be "is starting to break down",
or possibly even "is continuing to break down". This is something
progressive. Early attempts to handle it are partly responsible for
the vast increase in government bureaucracy. Jobs that weren't valuable
enough to do are getting done because of the additional value of
keeping people employed. And our metaphysic doesn't allow the hoi
polloi any other way to earn their livelihood. (This is new... the
independent yeomanry [and its analogous descendants] were the backbone
of Anglo-American civilization until around the early 1900's... possibly
into the 1940's.)

Civilizations don't have experience with rapid changes at a basic level
that aren't destructively traumatic. E.g., the reformed Russian
government seems to be turning into what the Duma might have evolved
into, had Lenin not intervened. Civilizations try to be conservative.
Cheap labor isn't new... slavery is old. But robots are new. The sweat of
your brow won't pay for either food or shelter, much less both. Not
when the competition is a robot. (Cf. John Henry and the steam
drill.) But things that only affect small fractions of the populace can
be dealt with by suppressing the disaffected by main force. Have you
noticed that the government is increasingly monitoring the populace for
signs of unrest? The aim is to suppress the disaffected, not to solve
the problem that causes the unrest. This is the traditional conservative
approach. It's what's always worked before. But as the disaffected
proportion of the populace grows, you need more and more police and
troops. That supplies jobs for some... but if the only job is being an
agent of suppression, what kind of government do you have?

The exporting of jobs can be considered a form of "pre-adaptation".
This can only be done if you can sufficiently well define the interface
that the job requires. And if you can do that, then it becomes a
candidate for automation.

But each job that has the interface well-defined, also defines its
interfaces with the jobs it interacts with. And this leads to other
jobs being candidates for automation.

As another problem: in Athens, at the time democracy was being defined,
the spread in income between the richest citizen and the poorest was
about a factor of fifty. Even so, the rich ended up with more than
fifty times as much influence in the government. In the US today, I
don't know what the spread is, but it's at least several thousand. And
the tiny fraction at the top of the pyramid makes the decisions that
coerce what everyone is supposed to do. Calling this a democracy doesn't
make it one. Oligarchy is closer. This has advantages, as oligarchies
*can* react more quickly to new trends. But first they must notice them,
and they only notice the ones that affect them directly. If one were to
take the "eye in the pyramid" as the symbol for this, the eye at the top
would represent those who decide what is good and bad for everyone
else to live by. One might call this "the burden of omniscience", as
they must know what's important at every level. And the rest are
supposed to ignore what they are really experiencing, and accept what
they are told as good and bad. One might call this the "burden of
nescience". But neither of these accurately describes what's going on;
they just describe what the society expects to work. When the
cognitive dissonance becomes too great, the pyramid collapses under its
own fictions. (Apologies to R.A. Wilson, "Illuminatus")

Now this is too dangerous a set-up to be allowed to exist, so multiple
levels of checks and balances are put into place: courts relatively
independent of the executive, e.g., and multiple levels of government
operating relatively independently. Unfortunately two things are
happening. The first is that this is a relatively clumsy system, with a
lot of built-in time delays. When things start changing quickly, it
can't adapt quickly enough. But the second problem is worse. It's to
the advantage of the people at the top to corrupt the checks and
balances, so that they can coerce decisions in the directions that they
feel to be proper. And the people at the top have considerably more
leverage on the internal workings of government than those nearer the
bottom. So the checks and balances not only stop working well, they
become corrupted with a bias in favor of the top. This tends to cause
increasing disaffection in the lower part of the pyramid, where the bulk
of the power resides. (The power at the top is concentrated and easy
to bring into play. The power at the bottom is diffuse and difficult
to control.) And as the corruption increases, the layer of the pyramid
at which the disaffection is occurring gets higher. At some point, it
will collapse. Having lots of government troops doesn't help much if
what breaks out is a civil war, as that just means that there are larger
armies fighting back and forth.

So there needs to be a new organization created before the disaffection
increases too greatly. But one can't expect the government to notice
that disaffection has increased to dangerous levels until after it is
far beyond merely dangerous. And the government *does* appear to have
noticed that disaffection is increasing. So either the levels of
disaffection are much higher than is obvious, or the government intends
to make it much worse. Recent statements about cutting Social Security
seem to point towards the second choice.

I've said this in other contexts, but *I* think it bears repeating. If
it weren't for the actions taken by persons in power, I would feel that
all reasonable steps should be taken to slow the onset of the
singularity. As it is, I feel the singularity may represent our only
hope for survival, dangerous though it is.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT