From: Ben Goertzel (ben@goertzel.org)
Date: Sat Apr 03 2004 - 19:33:17 MST
Michael Wilson wrote:
****
Designing a goal system that is stable under
self-enhancement, avoids radical changes to selection dynamics, but is
still capable of converging on better moral systems and better forms of
Friendliness is ridiculously hard (in CFAI terms this is a combination
of higher-level structure and acquisition). All of these things are
essential to surviving the transition and creating a Singularity that is
open-ended in potential but still ultimately meaningful in human terms.
To my knowledge, Eliezer Yudkowsky is the only person who has tackled
these issues head-on and actually made progress in producing engineering
solutions.
****
While I find Eliezer's ideas interesting and sometimes deep, I do not
agree, based on his posted writings, that he has "made progress in
producing engineering solutions."
The concepts outlined in his writings do NOT constitute "engineering
solutions" according to any recognizable interpretation of this term!
-- Ben Goertzel