From: John K Clark (johnkclark@fastmail.fm)
Date: Thu Jun 12 2008 - 11:28:54 MDT
On Thu, 12 Jun 2008 "Stathis Papaioannou"
<stathisp@gmail.com> said:
> Evolution might not favour foolish goals or fixed goals
There is no “might” about it.
> but that does not necessarily have anything
> to do with intelligence.
I believe having a foolish goal does have something to do with
intelligence, and having no fixed goal is the only way to avoid getting
caught in infinite loops when a goal that looked so reasonable when you
were young, ignorant, and stupid you now know to be contradictory. And
some goals become so anachronistic they turn to gibberish, such as,
“explain why there are only 5 planets”.
> If a person wants to kill himself, the
> significance of his being intelligent is
> that he will more likely be able to overcome
> obstacles put in his path
I would humbly submit that over the course of events this has not proven
to be a significant factor. It takes very little intelligence to die; in
fact it takes none at all. Just stop reacting to anything and you will
soon get your wish.
I have been accused of using anthropomorphic reasoning, and it’s true,
but I make no apology for doing so. Everybody uses it. I believe being
good at it is one of the things that drove the rapid increase in hominid
brain size, because the single most important factor in the environment
is the behavior of other animals, especially members of your own
species.
My anthropomorphism of a Jupiter Brain is that of a brilliant and very
complex person; the “friendly” AI anthropomorphism is that of a child of
Uncle Tom and Lassie, only with better SAT scores.
What is it, AI, has Timmy fallen down the well again?
John K Clark
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT