From: Ben Goertzel (ben@goertzel.org)
Date: Sat Aug 24 2002 - 18:19:55 MDT
James Rogers wrote:
> On 8/24/02 12:38 PM, "Ben Goertzel" <ben@goertzel.org> wrote:
> >
> > It's reasoning based on the integration of a huge number of weak
> > pieces of evidence. This is the kind of reasoning that consciousness
> > cannot handle. In humans. Whether consciousness will be able to
> > handle it in AI's is an interesting question. I think the answer
> > will be no, for a while. But post-Singularity I'm not going to
> > venture a guess....
>
>
> My objection to this is primarily that most intuition that is worth
> anything CAN be resolved through introspection and thought. People
> can usually piece together the reasons for their intuition if they
> think about it hard enough. Or at least I can.
In some cases, I have had strong intuitions about things but found that it
took months or years of study and effort to validate those intuitions in a
step-by-step, logical way.
Sometimes this process has resulted in great stuff; other times it has
resulted in nothing so far, and maybe never will (maybe they're bad
intuitions, or maybe my conscious reason isn't up to those particular
tasks).
It's not surprising if you and I have different internal mental processes,
however. Brains are various!
> Being right for the wrong reasons is the same thing as being wrong
> as far as I'm concerned.
But what's at issue is being right for reasons that are largely unknown to
the conscious mind, not for reasons known to be wrong...
ben