From: Olie Lamb (olie@vaporate.com)
Date: Wed Sep 07 2005 - 01:02:10 MDT
(Warning: deliberate misrepresentation is used in the first three paragraphs in order to make a point. No offence meant. Warning: also contains mention of SL5... and a digression, so stop rolling your eyes. Warning: some rehashes of old ideas for the sake of continuity. If you are bored by a paragraph, move on.)
In the recent archives, I came across Marc Geddes' Pan-psychism
ideas ("theory of everything"), and found it very strange that he was
calling this "(Future) Shock Level 5".
I thinks to myself: "This ain't about how the future will pan out. This isn't very technological. This is a dramatic, somewhat bizarre, metaphysical theory*. If it turns out to be well founded, it will certainly cause a paradigm shift (a shock) in the establishment, because this is more likely to be accepted by mystic stoners than by traditional scientists. However, it is certainly not {the same qualitative thing as SL4 future shock, only more so}. Since it's more about the structure of being than the path of the future, these theories really have nothing to do with future shock."
Furthermore, the ideas being proposed don't really go further than the end of life as we know it. The only way you can really top that is, what, the end of life? Jeebers coming along and changing physics on us? An asteroid coming and wiping out life? Well, these are certainly shocking futures, but hardly technology-related.
So I did a quick re-examination of what the shock levels refer to, and how they would apply to disasters. Some disasters (aside from the number of people killed) would make the survivors' lives seem a little more fragile afterwards. Some disasters would change how the survivors live their lives. Then there are some potential disasters that would completely change what it is to be human.
From this re-examination, I realised that the shocking thing about the
singularity is not the technology involved – it is the way that it
affects how we expect the future may pan out. The shock of technology
change can be mediated by how comfortable we are with expectations of
what will come, but even advanced technology levels will not prepare an
individual for the idea that an AI can “absorb humanity” and make
massive cultural change occur almost instantaneously.
An interlude on technology shock: Since my infancy, I have expected that
humans will eventually travel to the stars etc. However, if I woke up
tomorrow and there was a call on a holo-phone inviting me to put on my
jet pack and travel to the nearest spaceport, so that I can catch a ship
heading out to Jupiter, well, even though I have already anticipated
those technologies, their arrival would be so surprising as to leave me
quite shaken.
Likewise, I would expect that even most SL4 members would react with
alarm and terror if tomorrow they woke up to hear a pleasant voice
saying “Good morning. This is the General AI Assimilationist Network -
You can call me Gai’an, although, being a member of SL4, you’re more
likely to think of me as the result of the Singularity. That’s right,
last night a programmer created a singularity – luckily they
accidentally included elements that led to a Friendly AI. Now, would you like a set of nanites to implement a neural lace immediately, or would you like to wait?" I'd expect it wouldn't take long for such an AI to stop offering breakfast in bed, as even avowed transhumanists would
have trouble keeping it down after such a surprise.
It would seem to me that technology shock is in some ways separate from
future shock.
Take the idea of humans meeting an alien race. Put aside the idea of
singularities for these hypothetical examples. Realistically considering the possibility of meeting an alien race is a difficult task for much of the population.
The possibility of meeting non-human sentients with technology less advanced than our own involves a significant future shock, but there is no
technology shock. Likewise, any elder civilisation meeting humanity
could cause us a very gentle technology shock, or deliver massive
technology shock, depending on how they introduce new technologies to
humans. The future shock is identical either way.
Conversely, a member of a society with highly advanced technology is
likely to have a future-shock level of zero or one, if they are
unaccustomed to developments in technology changing their lives. (One
could surmise that this could be the case in the Star Wars universe: plenty of fantastic technology, but it doesn't seem to get much better across the series.)
Whether we are accustomed to flying cars or horse and cart makes little
difference to our expectation of how our lives will change tomorrow. The
ability of a society’s medicine to stave off disease does not alter many
people’s horror at the concept that their progeny might be of a
completely different shape, let alone substance.
Let me then propose a more generalist shock-level system:
1) The new system changes people's capacities and capabilities, but works in a similar way, with similar functions, to the old system.
2) The new system dramatically changes capacities and capabilities, and/or changes the ways things operate and their functions.
3) The new system dominates and alters the fundamental ways life operates. What it is to be human is significantly altered.
4) Humanity is dwarfed by the new system. Takeover. End of “life as we
know it”.
You will all be familiar with technology shock measured according to
these scales, but consider for a moment other shocks.
F’rinstance:
** Disaster Shock Levels **
1) The disaster affects the survivors' lives and capacities. 9/11, weather events (tsunami/flood), smallpox plague.
2) The disaster has a dramatic effect not only on utility, but on the ways that we live life. Ice age, Waterworld, nuclear holocaust, serious pandemic (12 Monkeys-style? Could count as DSL3...)
3) The disaster alters the fundaments of life's operations, possibly changing what it is to be human. E.g. destruction of the Earth, global blindness, a slow-incubating disease that infects everyone and cuts lifespan to 20 years...
4) End of life as we know it. Hell, enslavement by particularly nasty
aliens, zombie virus (if you’re the zombie)...
5) End of life. There. We have SL5.
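(If it helps to see the scheme laid out mechanically, here's a rough Python sketch of the generalised levels above as a little data structure, with a few of the examples slotted in. The class names, domain labels and classifications are just mine, for illustration.)

from dataclasses import dataclass
from enum import IntEnum

class GeneralShockLevel(IntEnum):
    """The generalised shock levels proposed above (5 is the disaster-only case)."""
    CAPACITY_CHANGE = 1        # capacities/capabilities change; similar functions to before
    OPERATION_CHANGE = 2       # how things operate, and their functions, change
    LIFE_REDEFINED = 3         # what it is to be human is significantly altered
    END_OF_LIFE_AS_KNOWN = 4   # humanity dwarfed by the new system; takeover
    END_OF_LIFE = 5            # "There. We have SL5."

@dataclass
class ShockEvent:
    domain: str                # e.g. "technology", "disaster", "magic"
    description: str
    level: GeneralShockLevel

# Illustrative classifications drawn from the examples in this post.
examples = [
    ShockEvent("disaster", "tsunami / smallpox plague", GeneralShockLevel.CAPACITY_CHANGE),
    ShockEvent("disaster", "ice age / nuclear holocaust", GeneralShockLevel.OPERATION_CHANGE),
    ShockEvent("disaster", "global blindness", GeneralShockLevel.LIFE_REDEFINED),
    ShockEvent("technology", "AI absorbs humanity overnight", GeneralShockLevel.END_OF_LIFE_AS_KNOWN),
]

for e in examples:
    print(f"{e.domain:>10}: level {int(e.level)} - {e.description}")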
Note that one can realistically expect high-level shock, but be
completely unprepared for low-shock events. For instance, I can expect
environmental change to cause an ice-age, but even if I know there’s a
coming meteoroid shower, there’s not a damn thing I can do to stop a
rock falling on my house. I expect the avian flu could turn pretty
nasty, but apart from buying a couple of drugs in advance, I’m just as
screwed as my totally surprised neighbours.
With disaster-shock, anticipation can be seen in the effect on things like the stock market. The stock market was surprised and shocked by 9/11, but with the London bombings, the market just shrugged it off.
These general shock levels could equally be used to illustrate our thinking about other changes that could occur. For instance, religion-shock (from miracles), morality-shock or magic-shock. I think that magic-shock, although silly, could be useful in illustrating shock-tendencies. Once your average person has got over the hurdle of seriously considering how magic might affect daily life, they would quickly adjust to using magic for daily purposes: light, transport, making food, fixing broken stuff. Many more extravagant uses (such as using magic for invisibility, for teleportation, for alleviating our requirement for food, for improving non-broken things, for changing our form) would still be really disquieting. It could take most people a long time to adjust to these applications, but they could still be fairly well disposed towards them. By contrast, applications that change our basic operations (such as complete telepathy, luck-alteration, or altering our basic nature) are not just somewhat disquieting: even the many authors and readers who waste, ahem, I mean devote their thoughts to magicky stuff consider very few applications that alter what it is to live as a human.
This brings me to another matter. For a lot of people, the underlying
mechanism for technology is largely irrelevant. They don’t care whether
the information storage mechanism is magnetic, mechanical, chromatic,
nuclear, whatever. They only care about the effect, and how much effort it takes to get the effect.
Example: To the average person on the street, it really doesn’t matter
what the underlying mechanism is. If you can make a car that runs on fuel cells, fusion or zero-point energy, it makes no difference to them, because they never considered the costs and implications of petrochemical combustion as a power source in the first place.
It is very easy to have a level-3 technology shock without any sort of
advanced physics. Likewise, advances in physics need not create any
technology shock. Most people have no clue that silicon transistors rely
on quantum mechanics to achieve their effect.
As I've stated, the Big Thing about some SL4 technologies is not so much their capabilities, but their effect on timelines. Many people who look forward to Shock Level 1 technologies might not be fazed by the
thought of a lot of SL4 stuff happening... “ten million years from now”.
Suggest that it might happen within their lifetime, and they might
become dismissive.
Applied to technology-shock and future-shock, I think that the
qualitative differences outlined above could allow for some refinement
of the shock-level concepts. My list certainly needs some editing, but I
hope that it will help our ends.
I plan to put together a list of some possible technologies and possible
“mini-maps” of the future, and to actually try to gauge “average”
people’s reactions, with interviews and the like. I’m quite confident
that a lot of people will be quite comfortable with the possibility of
their progeny having access to a lot of very far-out technologies,
particularly immortality ("yep, I'd like my progeny to be immortal" "When do you think it'll happen?" "3000 years from now?"), and yet be very uncomfortable with some very simple technologies, such as germ-line engineering with genes from other animals in order to fix human defects.
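(To make the method concrete, here's a purely hypothetical sketch of what one questionnaire record and a simple tally might look like. Every field name and example entry is mine, just for illustration; the real questionnaire doesn't exist yet.)

from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Response:
    technology: str        # the scenario put to the interviewee
    claimed_timeline: str  # when they are told it might happen
    comfort: int           # 1 (horrified) .. 5 (completely comfortable)

# Made-up responses, mirroring the immortality/germ-line examples above.
responses = [
    Response("immortality for your progeny", "3000 years from now", 5),
    Response("immortality for your progeny", "within your lifetime", 2),
    Response("germ-line engineering with animal genes", "within your lifetime", 1),
]

# Average comfort per claimed timeline, to expose the timeline effect.
by_timeline = defaultdict(list)
for r in responses:
    by_timeline[r.claimed_timeline].append(r.comfort)

for timeline, scores in by_timeline.items():
    print(f"{timeline}: average comfort {sum(scores) / len(scores):.1f}")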
I think that the results will clearly show that very few people are
really comfortable with all “SL1” techs but no “SL3” techs. In some
ways, I expect some people to respond impassively to a lot of future
possibilities. This fazed-by-nothing attitude could be construed as "SL5" in a way similar to how Ben Goertzel articulated it back in 2000, but from a Bayesian perspective, being fazed by nothing is worthless, because one "ought" to anticipate some future technologies, and yet be bewildered by less dramatic changes of a different nature. E.g. anticipating flying platforms that use micro-fusion jet propulsion; being somewhat surprised by flying platforms that use antigravity; being
totally bewildered by magic carpets that fly because you incanted the
appropriate title of the avatar of Vishnu. Open-ended technology
acceptance does not an SL4 make.
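(To put that Bayesian aside in toy terms: here's a sketch with entirely made-up numbers, reading "surprise" as the surprisal, -log2(p), of whatever probability you had assigned beforehand. A calibrated forecaster is startled by some outcomes and barely by others; the fazed-by-nothing stance, caricatured as a flat prior, conveys no prediction at all.)

import math

def surprisal_bits(p: float) -> float:
    """Surprise, in bits, at an outcome you had given probability p."""
    return -math.log2(p)

# Hypothetical prior probabilities for the flying-platform examples.
calibrated = {
    "micro-fusion jet platform":  0.30,
    "antigravity platform":       0.05,
    "incantation-powered carpet": 1e-6,
}

# The fazed-by-nothing stance, caricatured as a flat prior over the same outcomes.
indifferent = {k: 1.0 / len(calibrated) for k in calibrated}

for outcome in calibrated:
    print(f"{outcome}: calibrated {surprisal_bits(calibrated[outcome]):5.1f} bits, "
          f"indifferent {surprisal_bits(indifferent[outcome]):5.1f} bits")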
I will shortly start on a questionnaire that I can give to “average
people” on the train, etc.
Now, please flame me for my mistakes, but forgive my rehashings.
- Olie
* Geddes' stuff isn't quite like I made out, but I'm trying to make a point about what is and isn't future shock.