From: Ben Goertzel (ben@goertzel.org)
Date: Sat Aug 24 2002 - 18:50:34 MDT
Gordon wrote:
> > There may be many different psychological processes, in different minds,
> > resulting in similar rational behaviors.
>
> Only behaving in a way that looks rational is macho rationalism.
> Behaving rationally on accident is simply not good enough to be a
> rational *thinker*.
Behaving rationally *by accident* was not what I was talking about.
I was saying that there are no perfectly rational minds, only many different
approximately rational minds, which use different methods to approximate
rationality.
Similarly, there are no infinitely intelligent minds, just various more or
less intelligent minds, and these use different methods to achieve their
finite intelligence.
Whether a system that achieves intelligence or rationality purely at random
is "really" intelligent or rational is one of those slippery philosophical
questions that doesn't interest me much... Such questions seem to be more
about word meanings than about experienced reality.
What I was talking about was observing biases in my own unconscious thought
processes, and consciously compensating for them at the level of conscious
reasoning -- but NOT being able to fully repair the biases, because I don't
have full control over my own unconscious (a problem that AIs may have
much less severely).
Consider heuristics like
"To predict how long a software project will take, take your first estimate,
double it, then double it again."
"When you're hiking in the desert, things are a lot further than they
appear."
These are conscious, rational heuristics for postprocessing the results of
unconscious processing. We can employ these heuristics in order to improve
the rationality of our judgments. We can do this WITHOUT being able to
improve the rationality of the unconscious processes that create the first
estimates of project duration, or the first impression of the distance to an
oasis.
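As a toy illustration (a sketch in Python, with hypothetical numbers and
function names, not anything taken from cognitive psychology), the
project-duration heuristic amounts to a fixed postprocessing step applied to
whatever number the unconscious hands you:

    def corrected_estimate(gut_estimate_weeks, correction_factor=4.0):
        # Conscious postprocessing of an intuitive first guess:
        # "double it, then double it again" == multiply by 4.
        return gut_estimate_weeks * correction_factor

    first_guess = 3.0                        # "feels like about three weeks"
    print(corrected_estimate(first_guess))   # 12.0

The unconscious estimator itself is left untouched; only its output gets
corrected.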
Are you saying that in a higher state of "truly rational" consciousness, one
doesn't see the oasis as being closer than it really is? One's perceptual
cortex is rewired to avoid this kind of incorrect, "irrational" judgment?
Even if this is true to an extent, I don't believe it can be completely
true. I think that nonrational thought processes are a necessary
consequence of the severe finitude of our brains, combined with the severe
demands we place upon them. I think that pure rationality would not be an
efficient strategy given the limitations of our wetware.
Making a similar statement about post-Singularity beings would be tough, of
course...
> I am not drawing the distinction along traditional psychological lines,
> so that's probably the problem. Maybe I shouldn't use the words
> `conscious' and `unconscious' and should just stick with something like
> `feedback-able thought' and `feedback-less thought'. To be fair I don't
> know a lot about psychology, but from what I know about neuropsychology
> I get the impression that all thought is the same qualitatively and the
> only difference is that some thought is feedback-able and some thought
> is not.
That is just not the case. At least, it's not believed to be the case by
the vast majority of cognitive psychologists.
> > Some forms of irrationality are good at creating new ideas, but bad at
> > testing, analyzing and refining them.
>
> Just because the ideas are `new' (and most are just new to you) does not
> mean that they are good or usable.
Of course not. Rationality is great at testing newly created ideas, at
least in some contexts (including math, science, philosophy), and it will
often throw them out.
>
> > Thus rationality works well in conjunction with some irrational thought
> > processes, which involve various types of quasirandom concept creation.
> >
> > The most brilliant scientists have generally combined intensely
> > effective
> > rationality with quirky but effective nonrational thought processes...
>
> Would you really call this process random and irrational? New ideas
> come from following paths that connect ideas (using an appropriate
> memory model, of course) to reach new ideas. Some of these links are
> weaker than others, but connections none the less. You first explore
> strong connections for new ideas, and if they fail, you try weaker
> connections. Eventually, you hope to reach some new connection that no
> one realized before.
Following pathways is one heuristic, but another is assembling compounds out
of existing conceptual components. This assembly process has a large random
aspect, in my opinion (this is based on introspective experience and AI
experimentation, not cognitive psychology; cog psych really hasn't gotten
there yet).
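As a purely illustrative sketch (made-up component lists and a made-up
testing stage; not a description of any actual AI system), quasirandom
concept assembly followed by rational filtering might look like:

    import random

    components = ["wave", "particle", "market", "gene", "network", "memory"]

    def propose(n_candidates=10):
        # Quasirandom creation: pair up existing conceptual components.
        return [tuple(random.sample(components, 2)) for _ in range(n_candidates)]

    def passes_testing(candidate):
        # Stand-in for the rational testing/refining stage; a real system
        # would evaluate candidates against its knowledge and goals.
        return candidate[0] < candidate[1]

    new_ideas = [c for c in propose() if passes_testing(c)]
    print(new_ideas)

The creative step is cheap and largely random; the hard, rational work sits
in the filter.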
> > Irrational thinking can generate ideas that are later useful in rational
> > thinking. This is a subjective impression held by many many people and
> > quite thoroughly researched.
>
> Sure. And if I play roulette long enough, I'll probably win (of course,
> there is no statistical guarantee that this will eventually happen).
> Through directed thinking you can increase your ability to find usable
> ideas. I'm not saying that random thought doesn't work, but that it's
> inefficient and there are better ways.
Nonrational/quasirational intuition is much more than "random thought".
It is a specific assemblage of cognitive processes and structures, which has
some random components, and many other components as well.
It appears to be better at generating new ideas than the cognitive processes
and structures associated with conscious logical reasoning.
-- Ben G