From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Jun 06 2002 - 15:04:25 MDT
Ben Goertzel wrote:
>
> Gordon Worley wrote:
> >
> > You are in college and need to take a class in general relativity. You
> > notice, though, that there are lots of members of the opposite sex in
> > that oral communication class. In fact, there's one person in there you
> > like a lot and would like to get to know better. Taking that class will
> > set you back in your academic career. Assuming that you are in college
> > to learn and get some kind of degree, choosing to take the oral
> > communication class because it might let you get to know someone better
> > is an irrational decision.
>
> It seems that this is a matter of prioritizing short-term goals over
> long-term goals.
>
> To do so is not intrinsically irrational. What is irrational is *believing
> that one is acting optimally toward one's long-term goals*, whereas
> *actually* one is choosing one's actions toward one's short-term goals
>
> i.e., fooling oneself, by ignoring evidence
Let us by all means be precise...
Here's the chain of causality as I see it. Someone has short-term goals and
long-term goals which conflict. Let's say we have two fellows, one of whom
is more rational than the other. Both fellows, if they are not Zen masters,
will be attracted by the immediate short-term temptation, will see the
negative long-term consequences, and will feel cognitive dissonance as a
result; there is a sense of entitlement to the temptation and it would
require an expenditure of mental energy to refuse it. The difference
between the more rational and the less rational fellow is that the less
rational fellow may reduce cognitive dissonance by rationalizing away the
long-term consequences or rationalizing long-term support for the short-term
temptation. This path is less open to the more rational of the pair.
Cultural stereotypes are formed by observing departures from what is
believed to be the cultural norm. In this case more rational people, as a
result of structural differences in the thought process, emergently tend to
attend to long-term goals more than less rational people. This emergent
result is also reinforced by the abstract nature of long-term goals and the
concrete nature of short-term temptations; an ability to attend to abstract
thoughts equally with concrete thoughts is a talent that supports
rationality and is therefore correlated with it.
The resultant cultural stereotype, however, is based on surface
differences. It is supposed that "rationality" means being *biased toward*
long-term goals over short-term goals. I call this kind of stereotype
"macho rationality". Stereotypes of this kind are common in the cultural
perception of rationality, altruism, perfectionism, iconoclasm, and so on;
structural clarity which rules out a human-normal bias is seen as a bias in
the opposite direction. Similarly, rationality is often misinterpreted as a
bias toward discomforting beliefs - "macho rationality" again - where
rationality consists simply of focusing on truths whether they are
comforting or discomforting; a form of structural clarity which contradicts
the human-normal bias in favor of comforting beliefs, but does not consist
of a bias in the opposite direction.
> A great many scientific theories begin by someone starting with a theory
> they'd like to be true, and coming up with reasons why it might be true.
> This is abduction, in essence.
Abduction is observing something which is true, then increasing the support
of those hypotheses which are compatible with the observation. This form of
cognition is similar to "rationalization", and rationalization can
manipulate slack in hypotheses to change their predictions post facto -
which is why science requires prediction *before* the fact.
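To make the contrast concrete, here is a minimal sketch - Python, with
invented numbers - of abduction as Bayesian support-raising, and of where
rationalization's slack lives:

    # Abduction: an observation E raises the posterior of hypotheses
    # that predicted it.  All numbers here are invented for illustration.
    priors = {"H1": 0.5, "H2": 0.5}       # two competing hypotheses
    likelihood = {"H1": 0.9, "H2": 0.3}   # P(E | H), committed before seeing E

    # Observe E; update every hypothesis by Bayes' theorem.
    p_evidence = sum(priors[h] * likelihood[h] for h in priors)
    posteriors = {h: priors[h] * likelihood[h] / p_evidence for h in priors}
    print(posteriors)  # {'H1': 0.75, 'H2': 0.25} - H1 gains, H2 loses

    # Rationalization manipulates the slack: after E is observed, quietly
    # revise P(E | H2) upward so the favored H2 no longer loses support.
    # Committing the likelihoods before the observation - prediction
    # before the fact - closes off that move.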
> You are championing deductive inference, which is just one among several
> modes of inference. (And when probabilistic evidence is involved, deduction
> is not *absolutely* certain either, although it tends to have
> greater-confidence conclusions than induction or abduction.)
>
> I think that rationalization is, specifically, a *control mechanism for
> inference* that *specifically seeks to consider only evidence in favor of
> the truth of a certain proposition, rather than evidence against its truth.*
>
> It is not specifically tied to abduction, deduction, or any specific mode of
> reasoning, in my view. One can rationalize with pure deduction if one
> wishes.
If you use hypothetical abductive reasoning to assemble evidence, then you
should use it even-handedly to assemble both negative evidence and positive
evidence. This is in fact roughly equivalent to deductive reasoning.
> Also, I think that *rationalization itself* is a powerful heuristic for
> creative thought.
I disagree. I think that if you have a population of a thousand human
scientists, then most of them will be rationalizing humans. Some of them
will, by pure chance, rationalize ideas that happen to be correct, and their
rationalizing abductive processes may therefore, by coincidence, arrive at
correct conclusions. But this requires a large population of scientists.
Most of the rationalizers will drop out of the process because their
rationalized conclusions will be wrong, unless a great many scientists
rationalize the same conclusions simultaneously, in which case the whole
field goes down the wrong road. When you are using abduction properly, you
are simultaneously trying to assemble supporting evidence and negative
evidence, without prejudice toward either. I do not think that trying to assemble only
supporting evidence leads to any greater intelligence. It may sometimes,
unintentionally, lead to science as a whole taking larger steps through the
fitness landscape - if the population of rationalizing scientists is
sufficiently large and sufficiently diverse.
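A toy simulation of that weeding process - Python, with all parameters
invented for illustration - shows how a field can converge while each
individual rationalizer performs at chance:

    import random

    # Toy model: each scientist endorses one of K candidate conclusions,
    # of which conclusion 0 is true.  A rationalizer picks a conclusion
    # first and assembles support afterward, so his pick is effectively
    # chance.  The field then weeds out conclusions that fail against
    # reality.  K, N, and the seed are invented for illustration.
    random.seed(0)
    K, N = 20, 1000
    picks = [random.randrange(K) for _ in range(N)]  # 1000 rationalizers
    survivors = sum(1 for pick in picks if pick == 0)
    print(f"{survivors} of {N} rationalizers survive the weeding")
    # Roughly N/K (about 50) survive by luck alone.  The field can still
    # converge on conclusion 0, but only because a large, diverse
    # population was weeded - not because any individual thought well.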
If, historically, many scientists arrived at correct conclusions through
rationalization, this does not show that rationalization is a good strategy
for increasing individual intelligence. It shows that rationalization,
spread across a population of scientists and its subpopulation of
scientific iconoclasts, can sometimes approximate intelligence, given the
scientific weeding process.
If rationalizing scientific discoverers outnumber rational scientific
discoverers, which has by no means been demonstrated to be the case, this
still would not in itself show anything about the native effectiveness of
rationality versus rationalization; it might just be that there are many
more rationalizers than rationalists. The ratio of effective rationalists
to effective rationalizers in scientific history is the product of two
factors: the a priori ratio of rationalists to rationalizers in the
scientific population, and the per-capita rate at which each group produces
effective discoverers. It does not reflect the per-capita rates alone.
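A quick worked example, with invented numbers, of why the headcount of
effective discoverers tracks the base rates:

    # Invented numbers: suppose 90% of scientists rationalize, 10% are
    # rational, and a rational scientist is four times as likely per
    # capita to produce a correct discovery.
    n_rationalizers, n_rationalists = 900, 100
    hit_rate_rationalizer, hit_rate_rationalist = 0.05, 0.20

    effective_rationalizers = n_rationalizers * hit_rate_rationalizer  # 45
    effective_rationalists = n_rationalists * hit_rate_rationalist     # 20
    print(effective_rationalizers, effective_rationalists)
    # Rationalizing discoverers outnumber rational ones 45 to 20, even
    # though rationality is four times as effective per capita; the
    # headcount reflects the a priori population ratio, not just the
    # native effectiveness of either method.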
"Historically, some scientists have rationalized correct conclusions."
-/->
"I can arrive at better conclusions by increasing rationalization."
Although of course, if you *want* to rationalize, "Some historical
scientists have rationalized correct conclusions" makes a wonderful
rationalization for that philosophy...
> However, really great creative thinkers may proceed by
>
> 1) rationalizing their way to an interesting conclusion, by careful
> inference control
>
> 2) then, AFTERWARDS, considering their conclusion in a more balanced way,
> looking at both positive and negative evidence equally
>
> In other words, I think that rationalization is an important part of the
> creative thought process, but it is dangerous if used alone, rather than as
> one stage in a multi-stage process
I have found that, despite all common wisdom, the most critical part of
being really creative lies in narrowing down inventiveness to correct ideas,
not in "brainstorming". I don't need brainstorming. I'm inventive enough
already. What's needed is inventiveness that hits correct targets and comes
up with ideas that are *really* true and not just ideas that *sound* true.
> > When you are trying to find a reason why your conclusion is true, you
> > begin to feel like you are straining to come up with support. Stop
> > right there. If you don't know, you don't know, so don't make something
> > up, because whatever you make up is not helpful and will confuse
> > matters - an irrational choice.
>
> On the contrary, making stuff up is great. The important thing is to,
> afterwards, judge what you've made up by a harsher criterion (a fuller
> consideration of evidence).
I think that excellence in creativity is only achieved when you no longer
need to make stuff up. I think that while you're in the "making stuff up"
stage you occasionally hit the target, but you miss it just as often, and
that's not good enough. When you're in the "making stuff up" stage, you
have your raw intuitions, plus abductive hypotheses that are correct only
when the intuitions behind them happened to be correct to begin with;
"making stuff up" supplies support for intuitions that were already right,
but it doesn't let you refine your intuitions so
that they become more powerful. If you direct more skepticism at your
rationalized reasons then you may be able to spot some incorrect intuitions
because of the strained nature of their rationalized support, but this is
not actually rationality; this is a kind of irrational thinking arranged so
that some of the irrationalities cancel out.
Real rationality is very, very rare.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence