Re: Re: What is useful for a PhD?

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Dec 05 2006 - 15:45:12 MST


Hi,

> Great question. Becoming a computer science major is probably the
> worst thing you can do.

I think this is a major overstatement. Of course, I myself avoided
being a CS major and majored in math, but it is entirely possible to
be a CS major and get a strong grounding in AGI. You just have to
realize that this grounding won't come automatically as a consequence
of studying CS, or of studying narrow AI as presented in most
universities.

> Taking math is good, but avoid set theory and
> abstract algebra, and take a lot of statistics and probability.

And I found that in getting my PhD in math, nearly everything I
learned was irrelevant to AGI. I did avoid set theory and abstract
algebra, and mostly studied advanced analysis --- real, complex,
functional,.... Fascinating stuff, but not AGI-relevant in any direct
way. Useful here and there in narrow AI for stuff like vision
processing and SVMs.

My point is, no discipline -- CS, math, psych, neuro, etc. -- will
really teach you what you need to know to fully grok current thinking
on AGI.

What discipline you're centered in matters much less than whether
you're oriented toward gathering interdisciplinary knowledge that is
broad, deep, and appropriately focused.

> Things in a computer science program not relevant to AI:
> Learning how to program in twelve different computer languages
> Compiler theory
> Operating systems theory
> Computer architecture
> Complexity theory, other than knowing the difference between O(n log n) and O(n^2)
> (here I'm using "complexity theory" to refer to understanding the
> difference between recursive, recursively enumerable, and
> co-recursively-enumerable, not to "complexity studies" a la the Santa
> Fe Institute)

Obviously, this kind of list is highly opinionated and individual...

Since I think logic is more relevant to AGI than Phil does, I also think
some species of computational complexity theory are useful to know for
thinking about AGI.

For instance, I think the theory of Descriptive Complexity is
incredibly fascinating and might one day be part of an AGI theory.
Everyone should understand Fagin's Theorem ;-) ... the relationship
between logical frameworks and computational complexity classes is too
deep to ignore.... True, it's all about "worst case complexity" and
hence not directly applicable to AGI -- but I'll bet my uncle's left
elbow that eventually someone will generalize descriptive complexity
theory to average case complexity, and that the result WILL be useful
for AGI theory, telling you what kind of logical operations an AGI
needs to be able to carry out (explicitly or implicitly) in order to
solve what kinds of problems with reasonable efficiency...
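
For the curious, here is the standard statement of Fagin's Theorem in
rough LaTeX -- my paraphrase of the usual textbook formulation, not a
quote from any particular source:

  % Fagin's Theorem (1974): existential second-order logic captures NP.
  % A class K of finite structures is in NP if and only if it is the
  % class of finite models of some existential second-order sentence:
  K \in \mathrm{NP}
  \;\Longleftrightarrow\;
  K = \{\, \mathcal{A} \text{ finite} : \mathcal{A} \models
        \exists R_1 \cdots \exists R_k \, \varphi(R_1, \ldots, R_k) \,\}
  % where \varphi is first-order over the original vocabulary extended
  % by the relation symbols R_1, ..., R_k. In shorthand: NP = \exists SO.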

> Things that people at universities generally teach you about AI, that
> I don't think are very useful for AGI:
> Game-tree search
> How to write a unification algorithm
> Natural-language processing
> Commonsense reasoning, naive physics
> Philosophy of AI

Again, these are highly opinionated choices....

For example:
Game-tree search is not directly useful for AGI, but it illustrates
the general principles of forward and backward chaining inference,
which arguably have general system-theoretic and cognitive importance;
see

www.goertzel.org/dynapsyc/2006/ForwardBackward.htm

for some thoughts on this.
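
In case the terminology isn't familiar, here is a toy sketch of the
two control strategies over a made-up rule base -- nothing to do with
Novamente or the paper above, just an illustration:

  # Toy illustration of forward vs. backward chaining over rules of the
  # form (set_of_premises, conclusion). Purely illustrative; the rule
  # base is invented and acyclic, so no cycle handling is needed.

  RULES = [
      ({"bird", "healthy"}, "can_fly"),
      ({"can_fly", "hungry"}, "goes_hunting"),
  ]

  def forward_chain(facts, rules):
      """Data-driven: start from known facts, fire rules until fixpoint."""
      facts = set(facts)
      changed = True
      while changed:
          changed = False
          for premises, conclusion in rules:
              if premises <= facts and conclusion not in facts:
                  facts.add(conclusion)
                  changed = True
      return facts

  def backward_chain(goal, facts, rules):
      """Goal-driven: start from a goal, recursively establish premises."""
      if goal in facts:
          return True
      return any(
          conclusion == goal
          and all(backward_chain(p, facts, rules) for p in premises)
          for premises, conclusion in rules
      )

  print(forward_chain({"bird", "healthy", "hungry"}, RULES))
  # -> includes "can_fly" and "goes_hunting"
  print(backward_chain("goes_hunting", {"bird", "healthy", "hungry"}, RULES))
  # -> True

The point is just the direction of control: forward chaining pushes
outward from what you know, backward chaining works back from what you
want to establish -- and game-tree search gives you a concrete feel
for both.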

NLP covers a lot of areas, but my view is that statistical NLP has
taught us a lot about the nature of language, which is relevant to
understanding human intelligence and linguistic behavior and therefore
relevant to AGI -- even though, yeah, today's pragmatic NLP algorithms
are not the way a mind would process language...

> Possibly useful, but overemphasized:
> Lots of traditional symbolic-logic-based representation and deduction
> Expert systems

Now I would say "Expert systems" are totally irrelevant to AGI ;-)

> More useful:
> agent architectures
> psychology, esp. studies of perception, attention, learning, and memory
> neuroanatomy
> functional neuroscience
> probability theory, Bayes nets
> neural networks, optimization, genetic algorithm & programming

I agree with the above.
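
As a tiny illustration of the sort of probabilistic reasoning
involved -- just Bayes' rule on made-up numbers, not anything from a
particular AGI design:

  # Bayes' rule on invented numbers: what is P(condition | positive test)?
  p_c = 0.01                      # prior P(condition)
  p_pos_given_c = 0.95            # sensitivity, P(positive | condition)
  p_pos_given_not_c = 0.05        # false-positive rate, P(positive | no condition)

  p_pos = p_pos_given_c * p_c + p_pos_given_not_c * (1 - p_c)
  p_c_given_pos = p_pos_given_c * p_c / p_pos

  print(round(p_c_given_pos, 3))  # about 0.161 -- the prior matters a lot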

> signal processing: Fourier transform, wavelets, Kalman filters
> data analysis: PCA, ICA, factor analysis, discriminant analysis,
> logistic regression, SVMs

I pretty much disagree with the above. It's worthwhile stuff but not
terribly relevant to AGI.

> dynamical systems theory, artificial life, catastrophe theory or chaos theory

Strongly agree...

> Remember that the purpose of getting a PhD isn't to learn - it's to get a PhD.

Well, fortunately, the two goals are not really mutually exclusive...

More to the point would be: A PhD program doesn't exist to teach you
what YOU want to learn; it exists to teach you what the DEPARTMENT
you're studying in wants you to learn.

I learned a hell of a lot of mathematics while studying for my PhD.
As it happened, even at the time I was more interested in other
things, and spent more time studying weird stuff on the side than
studying the math I was supposed to be studying...

Some universities offer interdisciplinary build-your-own PhD programs,
BTW. I know University of Chicago used to, back in the day...

-- Ben G


