From: Michael Wilson (mwdestinystar@yahoo.co.uk)
Date: Thu Sep 15 2005 - 21:13:20 MDT
Richard Loosemore wrote:
> Russell Wallace wrote:
>> You also need to assume the world has no structure.
>
> I agree entirely!
>
> If this were a discussion on a mailing list about AGI or the
> Singularity, rather than a mailing list devoted to the mathematics
> of probability, I would say this was farcical.
If a theory or a researcher can't handle the basic, simplest cases
correctly, there is effectively no hope of handling more complex
cases correctly. It would be nice to say that everyone understood
these extremely simple examples of normative reasoning, but
evidently this is not the case, and attempting to proceed without
rectifying the deficiency will result in the person in question
falling flat on their face, predictably and repeatedly. I did so
myself several times before realising the importance of explicitly
ensuring that the fundamentals of reasoning are present and correct.
Ben Goertzel wrote:
>> Ben, I don't think you have a strong grasp on Bayesian probability
>> theory "as traditionally deployed" - at the very least you need more
>> practice applying it to example problems before you develop a decent
>> feel for it.
>
> Well, Eli, I don't think you have a strong grasp on my psyche or my
> knowledge of mathematics ;-) I was the 7-year-old kid who could do
> all sorts of advanced algebra but still vexed his teachers by sometimes
> making dumb careless mistakes in addition problems!
Eliezer wasn't criticising your general mathematical ability, which I'm
sure is excellent. Considered in isolation, probability theory is quite
simple algebra. However /applying/ probability theory to any real or
imagined problems is a formalisation/grounding problem, which requires
specific expertise that isn't strictly maths. It's particularly difficult
because, until you've had a lot of practice, your standard (broken) human
intuitions will fight you and try to induce you to make mistakes in the
grounding and hence in the maths. Beginners have a nasty tendency to jump to
unsupported conclusions in probability theory, because their intuition
is telling them (incorrectly) that 'X implies Y' and they translate Y
into the corresponding equation without checking the derivation. I
would hesitate to call you a beginner at applying probability theory,
but you did make quite a hash of the problem we just discussed; it may
be that your reluctance to apply rigour to your arguments is causing
you to shoot yourself in the foot. I didn't use to like rigour either,
because it slows things down and dampens those wild flights of invention
and intuition that seem so satisfying, but then I realised that I was
rationalising mental laziness and that the rigour is essential because
the mistakes are cumulative. You seem complacent about making 'small
mistakes' here and there because you believe that it's better to
concentrate on getting the 'big idea' right, but I think that in AGI
those kinds of small mistakes are indicative of reasoning flaws that have
a /cumulative/ effect on the probability of building something that
works. Which is to say, it doesn't take many 'small mistakes' before your
success probability goes to effectively zero. I suppose the point is that
the philosophical and psychological legacy surrounding AI can make it
seem like social science or even art, but it turns out that it's actually
more like physics; AGI demands conceptual and mathematical rigour.
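The kind of intuition failure described above can be made concrete with
the classic base-rate problem (a standard textbook illustration, not an
example from this thread; the specific numbers below are assumptions
chosen for the illustration). Intuition says a positive result from a
reasonably accurate test makes the hypothesis probably true; an explicit
derivation via Bayes' theorem says otherwise when the prior is low:

```python
# Base-rate sketch: P(H|E) computed explicitly rather than by intuition.
# Numbers are illustrative assumptions: 1% prior, 80% true-positive
# rate, 9.6% false-positive rate (the classic mammography figures).

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) by Bayes' theorem: P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

p = posterior(0.01, 0.80, 0.096)
print(round(p, 3))  # 0.078 -- far below the intuitive guess of ~0.8
```

Writing the denominator out forces you to check the grounding (what
exactly E and H are, and what P(E|~H) is) before trusting the
conclusion, which is precisely the step intuition skips.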
* Michael Wilson
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT