From: Michael Vassar (michaelvassar@hotmail.com)
Date: Tue Feb 14 2006 - 12:44:06 MST
Truly bizarre. You assert that you believe knowledge is *always* preferable
to happiness, and support it with a datum showing only that knowledge is
*sometimes* preferable to happiness, namely the volcano example.
It is in fact a trivial consequence of game theory that knowledge can make a
group of individuals worse off on net, or that an individual's knowledge is
often more costly to others than it is beneficial to himself.
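To make that concrete, here is a minimal sketch of the standard Hirshleifer-style
case: two risk-averse people can pool a risk while ignorant of who will be hit,
but cannot once the outcome is publicly known in advance, so public knowledge
leaves both worse off ex ante. The numbers and the square-root utility are my
own illustrative assumptions, nothing more.

    import math

    def u(wealth):
        # Concave (risk-averse) utility; sqrt is an arbitrary illustrative choice.
        return math.sqrt(wealth)

    WEALTH = 100.0  # each of two agents' endowment (hypothetical numbers)
    LOSS = 64.0     # a loss that strikes exactly one of them, 50/50

    # Ignorance plus mutual insurance: the agents agree in advance to split
    # the total, so each gets a guaranteed (2*WEALTH - LOSS) / 2.
    eu_ignorant_insured = u((2 * WEALTH - LOSS) / 2)

    # Public knowledge of who will be hit, revealed before contracting: the
    # agent known to be safe refuses to share, so ex ante each agent faces
    # the raw 50/50 gamble.
    eu_informed = 0.5 * u(WEALTH) + 0.5 * u(WEALTH - LOSS)

    print("expected utility, ignorant + insured:", round(eu_ignorant_insured, 3))  # 8.246
    print("expected utility, publicly informed: ", round(eu_informed, 3))          # 8.0

Both agents are worse off ex ante for the group knowing more, which is all the
game-theoretic claim requires.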
As for the supposed dysfunction of the 6 billion or so humans who can be
made unhappy by having knowledge (technically, what else *could* someone be
made unhappy by? People are information; their unhappiness is also
information, and is in fact, among other things, the knowledge that they are
unhappy): so what? Declaring people dysfunctional because they fail to
meet certain standards that you set for them is definitely not humane. It
seems to me to be more like the vindictive preference of one of evolution's
creations that the environment should impose selective stresses which he is
atypically well equipped to handle, not unlike the hope that a strong man in
a tribe might hold that wars and wild beasts will come and enable him to
translate his strength into status and differential survival relative to his
weaker peers.
A culturally benighted person, perhaps from an Islamic country, could be
made worse off by knowing that his daughter is not a virgin (honor
killings).
A sensitive, philosophical Westerner of Christian background, IQ 125, high
openness, and high neuroticism could be made a great deal worse off by
knowing that there is no reason to believe in his mythology (existential
despair, possibly leading to clinical depression).
You can be saddened by your cursory familiarity with thermodynamics (a
little knowledge is a dangerous thing; learn it for real, with the math, and
I think you will feel better), but apparently not by the Scylla and
Charybdis of chaos and determinism which one must also pass before becoming
comfortable with GAI.
Michael Wilson could hypothetically find himself in a situation where very
violent people very badly want him to program a seed AI for them before
Friendliness theory is complete. In such a situation he would not be able
to comply with their wishes, and would be made worse off by knowing the
details of what his friends and family were experiencing as a result.
Each of these scenarios involves people falling along a continuum of
increasing mental resilience, but ultimately I believe that they differ by
degree, not by kind. A FAI would not suffer in any of these situations
because it cannot suffer and is not a person, but I would be deeply
skeptical of any supposedly Friendly output which rapidly led to people who
were not vulnerable to the fourth scenario (ignoring invulnerability that
stems merely from the implausibility of that situation post-singularity).
> > That quote seems to me to be an appealing one for people who are more
> > able to endure unpleasant truths than their contemporaries, but it is
> > not a humane preference. That people can be made unhappy by correct
> > knowledge as well as by incorrect is a classic literary theme, and
> > obviously one with at least some truth behind it. Even relatively
> > rational people might want to know neither their opponent's poker hand
> > nor the time of their future inevitable death, and for the bulk of
> > humanity even a spouse's extramarital attractions or the mythologized
> > status of a historical hero can be painful with little attendant
> > benefit. I would be extremely reluctant to accept a singularity that
> > eliminated people's right to ignorance as "Friendly" or that
> > incautiously embraced the preferences of a person's extrapolated
> > informed self and forced that person to live with the choices implied
> > by those preferences.
>
>This is an interesting point. Personally, I consider that knowledge is
>always preferable to happiness. The typical situation that comes to mind
>is that of people living close to a volcano. Of course, there are items
>of knowledge not as important or vital as that.
>
>I'd say, though, that if knowledge makes you unhappy, you're
>dysfunctional. And yes, I'm probably dysfunctional for a certain domain
>of things, like, say, thermodynamics. Ignorance of extramarital
>attractions or the truth behind heroic myths seems to me to do actual
>harm rather often, but I won't get stuck on the example.
>--David