From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sun Oct 24 2004 - 08:54:17 MDT
Ben Goertzel wrote:
>
> A start toward this would be for Eliezer to write a systematic treatment of
> his ideas about FAI in the form of a (technical or semi-technical) book, to
> be published by a typical academic press. It's easier to get eccentric
> ideas published in book form than in academic journals, because the
> refereeing process is different.
I have a suspicion that this would take a downright amazing amount of time
- on the order of four years or more.
> It may seem odd to recommend paper publication of ideas that are already
> online, but the fact is that paper publication is taken more seriously by
> the powers that be. And I think that the process of putting one's ideas
> into a more "permanent" form often forces one to tighten up one's lines of
> argument.
Isn't that what "Levels of Organization in General Intelligence" was for?
It worked, too, but it took four times as long as I allocated. And as for
"permanent" form, it goes without saying that two years later I was
sneering at the primitive naivete of my own theory.
> Furthermore, there are plenty of European universities that will give a PhD
> based on research only, no coursework. After publishing a treatise on FAI,
> it might well be possible to get a PhD on the basis of that book. I have
> some particular European university connections that might be helpful in
> this regard. I also believe I could help with the publication of the book,
> as I know a few editors at academic presses.
Hm. Sounds like a lot of conjunctive probabilities...
...still, tempting enough that I might take you up on it, but only if the
Singularity Institute completely bogged down otherwise. It's the huge
amount of time required that's the problem. I'm a slow writer, and I seem
to become a slower writer as I become a better writer.
I wouldn't *do* FAI; I'd update LOGI to incorporate Bayes and information
theory, and write the book on the evolutionary psychology of human
"significantly more generally applicable" intelligence. If I wanted
academic respect and a PhD, I'd write on a subject that I already fully
understood, that people were already interested in, and that was at least
theoretically possible to explain.
But oh, the time required! Gobs and endless gobs of time! I shudder just
to think of it. And *it might still not work*, and I'd *still* have to do
all the FAI theory, and I wouldn't be as young when I did. All my life has
taught me the value of not being distracted. That's why I account it a
last resort.
I wish I weren't such a slow writer. It would open up more options. But
that's not something I've figured out how to change.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence