From: Tom McCabe (rocketjet314@yahoo.com)
Date: Wed Aug 01 2007 - 17:45:26 MDT
--- Dagon Gmail <dagonweb@gmail.com> wrote:
> Humanity has shown itself so stubborn, so conceited,
> and so elitist,
> as a quality of almost genetic proportions, the
> implications are
> staggering. We have scarcely discarded our Pleistocene
> hunter-gatherer
> genes and are already adapting faster than credible
> to modern
> mass-murder realpolitik. The genes allowing such
> creative leaps have
> been migrating steadily upwards in the gene-pool,
> hand in hand with
> all the damn money.
Evolution has not had any significant influence on the
human species over the past thousand years; the
timescale is far too short, and recent selection
pressure has been weak. See
http://dspace.dial.pipex.com/jcollie/sle/.
> My intuition screams at its loudest, for what it's
> worth, the following points.
>
> 1 - a small but expanding number of extreme thinkers
> in all 3
> remaining superpowers are contemplating the
> emergence of completely
> unorthodox technologies. Russia has shown it thinks
> most outside the
> box, with its references to He3 mining and nanotech
> superpower
> ambitions and notions of orbital energy stations,
> but the US, with
> its nutty think tanks and New American Century
> rhetoric has done its
> fair share of wild eyed speculation too.
No government-sanctioned or political think tank
speculation that I've seen mentions AGI. PNAC's agenda
is a political system, not a new piece of technology.
> 2- right now I sincerely believe that no sincere
> powerplayer anywhere
> inhabiting some consolidated power ivory tower,
> anywhere in the US or
> Russia or China will endanger his credibility and
> career by actually
> suggesting that pseudo-Raelians like Kurzweil
See http://www.sl4.org/archive/0511/12949.html on why
transhumanism is not a cult.
> could
> actually have a
> point - publicly that is. However I am positive the
> filthy chthonian
> tentacles of the sith are even present here, on this
> forum, in the
> shape of junior think tank interns and other
> assorted imps and minions
> of darkness.
If I were offered a position as a junior intern in a
(sane, non-political) think tank, I would take it.
Does this make me an imp or minion of darkness?
> 3- I am certain superpowers are terrified and
> extremely defensive
> about emerging technologies
Er... like what? Biowarfare and ICBMs are both
well-established technologies, and so far as I know,
we don't even have a well-organized plan in place to
deal with those. If the US were terrified of AGI, it
would not have granted SIAI 501(c)(3) status.
> and have been building
> reserves of all
> kinds to weather... unknowns... accidents. From
> hidden swiss bank
> accounts overflowing with pentagon money, to
> consolidated oil fields
> and salt mines with unprecedented barrel reserves, I
> am sure and I
> have seen clues the superpowers are cautious.
Political cautiousness has little to do with AGI; if
we did actually develop AGI, none of this stuff would
help anyway.
> Take
> for instance the
> paranoid race to develop and keep secret the
> spinoffs of the
> Metalstorm technology. Metalstorm has already shaken
> several strategic
> balances, and it's still as simple as staplers.
>
> What if some lunatic comes up with cold fusion? What
> if some lunatic
> comes up with antigravity - the brass ponders with a
> furrowed and sweaty
> brow. And that's only linear predictable stuff.
Cold fusion and antigravity are both physically
impossible according to currently accepted theory;
they are predictable only in the sense that you can
very confidently predict neither will ever happen.
Antigravity requires negative energy density, which
allows solutions to the Einstein field equations (see
http://en.wikipedia.org/wiki/Alcubierre_drive) that
permit FTL travel, causality violation, and a whole
bunch of other weird stuff - a strong reason to doubt
that bulk negative energy density is physically
realizable. Cold fusion is impossible because the
strong force (which is responsible for fusion) only
acts over ranges of a few femtometers; to bring two
nuclei that close together you must overcome their
mutual Coulomb repulsion, which takes energies vastly
greater than anything available at "cold"
temperatures.
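For a rough sense of the energy gap, here is a
back-of-envelope Python sketch (my own illustration,
not from the original post; the 1 fm separation and
300 K temperature are assumed values):

    # Coulomb barrier between two protons at strong-force range,
    # compared with typical thermal energy at room temperature.
    K = 8.99e9            # Coulomb constant, N*m^2/C^2
    E_CHARGE = 1.602e-19  # elementary charge, C
    K_B = 1.381e-23       # Boltzmann constant, J/K

    r = 1e-15  # ~1 femtometer: range where the strong force takes over
    T = 300    # room temperature, K

    barrier_J = K * E_CHARGE**2 / r  # Coulomb potential energy at r
    thermal_J = K_B * T              # thermal energy per particle

    print(barrier_J / E_CHARGE / 1e6)  # ~1.4 MeV barrier
    print(thermal_J / E_CHARGE)        # ~0.026 eV thermal
    print(barrier_J / thermal_J)       # ratio, roughly 5e7

The barrier is tens of millions of times larger than
room-temperature thermal energy, which is the gap any
"cold" fusion scheme would somehow have to bridge.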
> The
> same brass,
> generally elder men with straightforward linear
> intellects, will have
> trouble seeing the implications of smart, evolving,
> modular robotics
> with a portable fabber parts womb. The average
> transhumanist gets
> sparkles in his eyes when I say that, but 99% of
> pentagon staff will
> look puzzled and try wiki-ing what I just said.
99%? Why not 99.999%? Out of six billion people in the
world, there are only a few thousand transhumanists,
and I don't see any reason why transhumanists would be
attracted to jobs as Pentagon staff officers.
> And
> then most of them
> will think I am one of those star trek losers.
> Convince them however
> that such a thing could be a reality before 2025 and
> they'll get
> seriously nervous.
We have a very hard time convincing technophiles of
the benefits of the Singularity (see
http://www.wholeearthmag.com/ArticleBin/111-6.pdf by
Cory Doctorow). How are we going to convince
government bureaucrats, even if we wanted to?
> 4- My intuition screams, again,
Your intuition fails when placed into unusual
situations. See
http://en.wikipedia.org/wiki/List_of_cognitive_biases
for a long list of all the situations in which your
intuition has been experimentally demonstrated to
fail.
> for what it is
> worth, that elements in
> all major superpowers have by now come to the
> conclusion that "it
> wouldn't be so bad if a major percentage of people
> succumbed to a
> variant of passive demise", somewhere in the next
> 20-50 years. I am
> sure there will be (a) reports detailing how such a
> terrible thing
> could happen have existed for decades, (b) studies
> on how to
> repopulate earth with people less inclined to be
> homosexual or liberal
> or potheads or French have been completed, (c)
> there may even be
> studies to ehm coalesce such a horrible idea into
> ehm a post-reality
> state. Purely speculative of course, but only
> because the heathen
> communist slopes have a similar program.
>
> As such one thing is clear: I do not trust people in
> power. Am I
> wrong, after having witnessed several genocides as
> casual topics in
> the news, just after TV reports of Paris Hilton
> and blipverts? I am
> positive people are scum, as a rule, and once given
> a good reason and
> a few billion dollars, everyone, even that nice lady
> across the
> street, can sink to the moral equivalent of Dick
> Cheney.
See
http://en.wikipedia.org/wiki/Fundamental_attribution_error
on how behavior is driven far more by situation than
by disposition - which is why stressful situations can
turn otherwise ordinary people evil.
However, people who believe they have a great deal of
moral responsibility (due to the tremendous impact of
the Singularity on the world) are likely to be much
more resistant to this than your random nice lady,
especially if they've studied the subject before.
> Which leads me to one conclusion worth mentioning
> here: people like
> Eliezer, people who make the same bold techno-erotic
> statements and
> have the same eloquent charisma and credibility,
> will, at some saturated
> point in the future, receive a visit from men in
> dark suits, bearing
> suitcases with money, blonde pleasure-slaves and
> other assorted
> temptations to lure them to all kinds of black ops
> think tanks.
Eliezer's ethical system sounds like it would be
fairly easy to manipulate; just place him in a
situation where the optimal path to a Friendly
Singularity coincides with whatever you want him to
do. Governments may be bad at technology development,
but they're fairly good at psychological manipulation.
> They will try to buy the Eliezers of transhumanism
> when they start
> believing. Once the first stray transhumanist with
> fiery eyes gets
> his first visit from a present-day Mephistopheles, we
> can make a sure
> assumption Skynet's evil twin brother is less than 5
> years away.
>
> Dare I say "muhuahua?"
>
We've had a hard time finding people who are skilled
enough to work on an AGI project, and we've been
trying for the past seven years; how the heck is the
government going to do it?
- Tom