From: Stephen Reed (stephenreed@yahoo.com)
Date: Mon Oct 03 2005 - 12:29:25 MDT
Regarding significant differences between Ray's
projections on the Singularity and my own:
1. Ray believes the path to what we call an AGI is
to model large-scale human cognitive functions in
software. In contrast, I think a direct
engineering approach will succeed before Ray's
approach does.
2. Ray does not say much about the dangers of an
uncontrolled or poorly executed AGI. Much of the
discussion here is about achieving a safe AGI. I
share those concerns.
3. Ray does not say much about the organizations,
projects, or individuals that will create an AGI. I
believe the U.S. military, and to a lesser degree the
U.S. intelligence community, is funding (good
old-fashioned) AI and robotics research, and will pour
funding into AGI projects once evidence appears that
AGI could in fact happen.
Cheers.
-Steve
[Note that I will stop using my Cycorp email to
further distance myself from representing Cycorp or
its sponsors - which I do not.]
--- David Massoglia <DMassoglia@pontiac.k12.mi.us>
wrote:
> I am a huge fan of Ray Kurzweil and believe him to
> be extraordinarily
> gifted. I would be curious to hear from the many
> intelligent people on
> this board if they have any "significant"
> differences from Kurzweil's
> opinions and projections on the Singularity or other
> significant matters
> and why. Again, I am only interested in
> "significant" differences and
> will leave that to your judgment.
>
> Thanks,
>
> David
This archive was generated by hypermail 2.1.5 : Tue Feb 21 2006 - 04:23:04 MST