From: Ben Goertzel (ben@goertzel.org)
Date: Wed Apr 30 2003 - 17:30:57 MDT
I agree that a book on Eliezer's FAI and related ideas would be more useful than a book on rationality. A chapter or two on rationality, with a focus on thinking rationally about AI, FAI, and the Singularity, would fit in nicely too.
ben g
> -----Original Message-----
> From: owner-sl4@sl4.org [mailto:owner-sl4@sl4.org] On Behalf Of Ramez Naam
> Sent: Wednesday, April 30, 2003 6:17 PM
> To: sl4@sl4.org
> Subject: RE: Singularity Institute - update
>
>
> Instead of the roundabout strategy of writing a book on rationality,
> how about:
>
> 1) Writing a book on your AI and FAI ideas?
>
> 2) Publishing your AI and FAI ideas in AI journals?
>
> 3) Pursuing a PhD in AI? (which would force you to do #2)
>
> Any or all of the above would have the advantages of:
>
> a) Spreading your ideas to other people working in AI. FAI could go
> from a project that you and a few other people are working on to a
> mainstream consideration for AI work. This would seem to reduce the
> overall risk of a non-friendly AI being inadvertently built by some
> other AI research group.
>
> b) Giving you more direct credibility. Scientific publications,
> technical books, and mainstream credentials all increase your ability
> to raise funds from private sources, acquire funds in the form of
> grants, and convince others of your ideas.
>
> A book on rationality seems very low leverage to me. It doesn't
> specifically target people with the skills to work on FAI or people
> with the potential resources and inclination to help fund FAI.
>
> cheers,
> mez
>