From: Marc Geddes (marc_geddes@yahoo.co.nz)
Date: Mon Nov 08 2004 - 01:34:57 MST
>Forward the Singularity!
>-- Eliezer S. Yudkowsky http://intelligence.org/
>Research Fellow, Singularity Institute for Artificial Intelligence

Could you hurry up with the Singularity, Eli? I've had it with pre-Singularity existence.

Seriously, I'm starting to get worried. I'm not so sure the good guys will win. I'm no longer so sure we're going to pull through. Your appeal does seem to have shocked some people out of their complacency, which is good, but it is also depressing. I feel the 'window of opportunity' for a successful Singularity is closing. Goertzel, Voss and you seem to be running the only serious AGI projects, but none of them can muster sufficient funding or a sound enough theoretical basis. I feel my 'life force' starting to slip away (I'm 33 now - getting old). Existential threats are starting to pop up. I fear we're not going to make it :-(

I'm not convinced that even the conceptual fundamentals of FAI theory are yet understood, let alone a full-fledged formal understanding. Sing Inst seems to have the strongest theoretical basis, but there is a huge gap in your understanding: Personhood! You must grasp the nature of 'Personhood'. Only then a Bayesian Jedi will you be ;)

But hurry, hurry!

=====
"Live Free or Die, Death is not the Worst of Evils."
 - Gen. John Stark

"The Universe...or nothing!"
 - H.G. Wells

Please visit my web-sites.

Sci-Fi and Fantasy : http://www.prometheuscrack.com
Mathematics, Mind and Matter : http://www.riemannai.org