Re: Michael Anissimov's 'Shock Level Analysis'

From: Brian Phillips (deepbluehalo@earthlink.net)
Date: Thu Jan 17 2002 - 03:22:07 MST


----- Original Message -----
From: Jeff Bone <jbone@jump.net>
To: <sl4@sysopmind.com>
Sent: Thursday, January 17, 2002 1:27 AM
Subject: Re: Michael Anissimov's 'Shock Level Analysis'

>
>
> "Eliezer S. Yudkowsky" wrote:
>
> > What is this strange fascination that the word "SL5" seems to exert
> > over people? Why do so many people, confronted with SL0 through SL4,
> > feel this impulse to top it by coming up with SL5?
>
> One possible reason with some amount of merit:
>
> There're a number of folks in "the community" for whom the Singularity
> is viewed as "opaque," apocalyptic, something beyond which nothing can
> be known --- or at least, beyond which probabilities for various
> scenarios cannot be calculated at all. For these folks, the Singularity
> is a kind of Rapture, an almost mystical, eschatological event in
> itself. Let's call those folks "transhumanists" with a nod to their
> tendency towards viewing things from a kind of genericized humanist
> viewpoint. Let's say those people are the ones for whom SL4 is really a
> terminal concern. These people are, for example, very concerned about
> the potential loss of individual human lives in the time between now
> and Singularity.
>
> SL5 folks would then be those for whom the truly long-term outcomes of
> generic intelligence and the universe itself --- the cosmological
> eschatology implied by physics and technology --- are the thing of
> interest and motivation, who do not reject the idea that even today's
> knowledge can tell us something about the post-Singularity long-term.
> For these folks, Singularities aren't eschatological in any sense;
> they're just business-as-usual, a standard stop along the road from
> eukaryotic life to ascendant intelligence. Let's call those folks the
> "posthumanists" and posit that they've abandoned priorities of
> individual or even species survival in favor of the notion of ultimate
> survival of intelligence in general.
>
> I'm not sure that there are really hard-and-fast boundaries between
> these things; any classification scheme (like the existing SL levels)
> is a coarse and inaccurate tool. However, it might be useful to
> consider these two points of view as relatively distinct.
>
   I would have to chime in that while Eli's Shock Levels are coarse and
necessarily inaccurate... this is even worse! "Transhumanist" is a word
that already has an established meaning, and similarly "posthumanist".
This is like me hijacking "Friendly AI" to refer to my new
super-programmable Furby!
  SL3 people do not necessarily follow the above schema. I'm hanging out
at SL3; my basic position is that I do not yet have sufficient grounding
in cognitive science (though it's my field of study) to be able to tell
whether Friendly Seed-based >H AI is something we will be able to do on
short time scales, or whether a single hard takeoff is especially likely.
I don't think of the Singularity as a rapture. It's just an emergent
property... not unknowable, just really, really difficult. Ontotechnology
and magic physics for the Sysop are something lots of SL3ers reject.

brian


