From: James Rogers (firstname.lastname@example.org)
Date: Sat Aug 24 2002 - 14:40:26 MDT
On 8/24/02 12:38 PM, "Ben Goertzel" <email@example.com> wrote:
> It's reasoning based on the integration of a huge number of weak pieces of
> evidence. This is the kind of reasoning that consciousness cannot handle.
> In humans. Whether consciousness will be able to handle it in AI's is an
> interesting question. I think the answer will be no, for a while. But
> post-Singularity I'm not going to venture a guess....
My objection to this is primarily that most intuition that is worth anything
CAN be resolved through introspection and thought. People can usually piece
together the reasons for their intuition if they think about it hard enough.
Or at least I can. I may use intuition as a shortcut when I don't have time
to evaluate it, but given some time I can usually reason through it, and I am
fairly skeptical of my intuitions, particularly in areas where I don't have
much expertise.
Marketing, sales, and con artistry all exploit the fact that people's
intuition is wrong in a number of different ways and can be easily
manipulated. I believe this has been shown in a number of studies (I'm too
lazy to google it, but I remember reading them): people's "intuition" about
most things is so routinely wrong that it should be heavily discounted. In
the real world, intuition that you can't decipher has a
really bad track record. This is particularly true in the absence of
sufficient information. Being right for the wrong reasons is the same thing
as being wrong as far as I'm concerned.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT