Re: Metarationality (was: JOIN: Alden Streeter)

From: Gordon Worley (redbird@rbisland.cx)
Date: Sun Aug 25 2002 - 02:11:22 MDT


On Saturday, August 24, 2002, at 09:14 PM, Ben Goertzel wrote:

> Gordon wrote:
>>> Firstly, Gordon, I would like to invite you to define or describe what
>>> you
>>> mean by "rationality" as clearly as you can.
>>
>> Okay, I'm going to try to do this. This is a bit like asking a Buddha
>> to define enlightenment, though. :-P
>>
>> Rationality is a qualitative change in thinking. It is characterized
>> by
>> the consistent use of logical thought and Bayesian reasoning and the
>> disuse of evolved thinking (in all its forms: intuition as it is
>> commonly understood, rationalization, pseudo logic, etc.). You
>> eliminate all irrational thought, and what is left is pure rationality.
>
> I think of rational inference as a kind of "evolved thought".... Where
> else
> do you think it came from?

There is surely overlap between good ways of thinking and evolved
thought processes. After all, evolution wouldn't work if it didn't
occasionally find the right answers. Still, removing bias from this
thought process is difficult enough that I consider the end result
different from the thought process you started out with.

> I don't believe it's possible to eliminate all irrational thought from a
> human brain. I am willing to be proved wrong, but I can't think of
> exactly
> how such a proof would be executed prior to having a full understanding
> of
> brain function and highly accurate brain scanners. I guess I'd be
> almost
> convinced if I met someone who qualitatively seemed to demonstrate pure
> and
> perfect rationality, but I have never met that person.
>
> Of course, I've never met you ;-)

I'm not proof that it's possible. I think, though, that it is possible,
and if it is, I might just get there. It may turn out that uploading is
necessary, but right now I'm working on the assumption that it's not.

> So, when I sit down to improvise at the piano, as a rationalist would I
> have
> to reason about which note to play next?

Yes. Playing the next note is a fulfillment of playing the song, which
is in turn a fulfillment of whatever reason you had for playing the
piano in the first place.

> When writing an article, should I reason consciously & logically about
> which
> sentence to type next?

When writing, I find that I can't spend all of my time watching how I
come up with the words to write. As you train yourself, you stop
thinking irrationally even in the unobserved parts of your brain. Until
then, though, you have to proofread to see whether what you wrote is
logical.

Of course, I write quite slowly and in a nonlinear fashion, because it
takes me a while to think up each sentence. If I am able to hammer out
something quickly, it is because I have already had time to think about
it. I suspect that this capacity for deep, slow thought (in humans the
two are linked; in AI I hope they need not be) is one brain feature
that gives some people a greater disposition to rationality.

> What about "going with the flow"? This seems to me to be a very
> efficient
> algorithm for producing good stuff. Reason seems to enter into such
> processes mainly
>
> * to judge things afterwards
> * when a particularly tricky spot comes up

I have no idea what "going with the flow" is even supposed to mean, so I
don't have any way of responding. Plenty of people talk about it, but
that doesn't bring me any closer to knowing just what this thought
process is. It's likely a thought process I've engaged in, but calling
it "going with the flow" does not make me think of anything in
particular, other than writing in a stream-of-consciousness style.

> If I had to consciously, logically reason about every note I played on
> the
> piano, or every sentence I wrote, then I would not get to do much of
> anything with my time...

Your brain does it for you. All you have to do is teach it not to be
influenced by irrational thoughts.

>> Maybe now you see why any use of irrational thinking bothers me. Sure,
>> you can keep using irrational thought processes, but you will only ever
>> gain so much rationality, and it will be tenuous since you are
>> maintaining a tie to irrational thoughts when you know that you should
>> be trying to eliminate them.
>
> I don't "know" or even believe that I should be trying to eliminate
> them.
>
> I think that nonrational thinking is highly adaptive for me and other
> human
> organisms. It needs to be kept in balance with rational thought, and
> the
> different aspects of both rational & nonrational thought need to be
> kept in
> balance with each other.

This sounds like an escape mechanism your brain would use to make sure
you don't stop trying to reproduce. Escaping it is hard, but if you
can, there's a good chance you are already rational.

> However, I nevertheless believe that a thought process incorporating
> *some
> nonzero degree* of nonrationality is maximally efficient for the human
> brain.

Nonrationality is making decisions via evolutionary biases or `at
random', to whatever extent that is even possible. I don't think either
helps thought at all. At worst you might have to make weak guesses
rather than strong ones, but there is no need to resort to pseudorandom
choices or evolutionary bias.
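To make the contrast concrete, here is a minimal sketch in Python (the hypotheses, priors, and likelihoods are invented for illustration, not anything from the discussion above): a "weak guess" just means picking the hypothesis with the highest Bayesian posterior, even when the evidence only barely favors it, rather than discarding the evidence and choosing pseudorandomly.

```python
def posterior(priors, likelihoods):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / sum over H of P(E|H) * P(H)."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    evidence = sum(joint.values())
    return {h: p / evidence for h, p in joint.items()}

# Two hypothetical hypotheses; the evidence only weakly favors A.
priors = {"A": 0.5, "B": 0.5}
likelihoods = {"A": 0.55, "B": 0.45}

post = posterior(priors, likelihoods)

# The "weak guess": take the most probable hypothesis, however slight
# its edge. The irrational alternative would be something like
# random.choice(list(post)), which throws the evidence away.
weak_guess = max(post, key=post.get)
```

The guess is "weak" in that the posterior for A is only 0.55, but it is still strictly better than a coin flip, which is the point: the evidence is never so thin that randomness beats using it.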

> And whether this holds for nonhumanlike AI systems, I also don't know.

I don't think a nonhuman AI could think this way, unless you
deliberately programmed it to make decisions based on pulling random
numbers out of /dev/random.

> I do know that in the Novamente design, we've had to make some
> compromises
> that effectively make Novamente irrational in some cases and contexts.
> Because to preserve pure rationality in all contexts is not
> computationally
> plausible.
>
> Now, you may say this proves Novamente is an inadequate or suboptimal AI
> design. I am sure it is suboptimal, but I hope it is not inadequate!
> And I
> encourage you or Eliezer or anyone else to propose a detailed design
> for an
> AI mind that is thoroughly rational, designed to achieve high levels of
> creativity without any significant nonrational elements ;)

Gordon will get back to you once he is an AI researcher. I expect
Eliezer is creating his AI design to encourage rational thought.

--
Gordon Worley                     `When I use a word,' Humpty Dumpty
http://www.rbisland.cx/            said, `it means just what I choose
redbird@rbisland.cx                it to mean--neither more nor less.'
PGP:  0xBBD3B003                                  --Lewis Carroll


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT