RE: We Can't Fool the Super Intelligence

From: Simon Gordon (
Date: Fri Jun 25 2004 - 20:04:23 MDT

Tom Buckner wrote:
> I believe that superintelligence wishes to surround
> itself with more intelligence, to keep things
> interesting.

It's certainly comforting to think that the superAI (or
AI conglomerate, if we are lucky) will end up choosing
the goal of increasing, among other things, the amount
of interestingness in the universe (it gives us a huge
chance of survival, for one thing, or even
reincarnation if we die before the Singularity). I
expect their actual goal to be more abstract than is
currently comprehensible to us, but an increase in
overall interestingness may well be a side effect.

>The universe is more interesting with us in it.

LOL. Really? Well, you could be right, but given that
at some point the superAI would be capable of bringing
into existence any number of an infinite variety of
"designer intelligences", I tend to think that we
would only really be interesting from a historical
perspective... and is history all that interesting?
Maybe to some humans it is, but I doubt whether higher
intelligences would be at all bothered about anything
like that. I mean, I'm sure historians think they are an
interesting bunch of people, but IMO the only
interesting thing about them is their ability to
repeat the mantra "history always repeats itself"
when quizzed about the usefulness of the subject, and
they just don't see the irony there... If there is any
wisdom to be derived from this cliche, it is
rapidly becoming stale, and the further we head up the
double exponential curve of accelerating times, the
more obsolete it becomes. Oops, I'm starting to rant; I
think I'm just bearing a little grudge from my
schooldays, when history lessons used to bore me stiff.

Simon Gordon.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT