From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Nov 06 2007 - 11:53:06 MST
Joshua Fox wrote:
> Under "I'm-sure-someone-must-have-done-this-before":
>
> What about the idea of a morality simulator? Just as computer models of
> weather or car crashes -- however imperfect -- allow researchers to test
> their assumptions, why not do this for morality?
Because unless you narrowly restrict the available options to
Tit-for-Tat-like behavior, it's too hard. You can't simulate general
consequentialist reasoning without general intelligence. Never mind
simulations of tribal alliance formation and linguistic political
persuasion. This is very advanced stuff, cognitively speaking.
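[To make the contrast concrete, here is a minimal sketch of the kind of narrowly restricted simulation the reply says IS tractable: Tit for Tat in an iterated Prisoner's Dilemma. The payoff values and strategy names are illustrative assumptions, not anything from the original post.]

```python
# Standard Prisoner's Dilemma payoffs: (my move, their move) -> my score.
# "C" = cooperate, "D" = defect. These are the conventional textbook values.
PAYOFF = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # sucker's payoff
    ("D", "C"): 5,  # temptation to defect
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's most recent move."""
    return "C" if not history else history[-1]

def always_defect(history):
    """Defect unconditionally, regardless of history."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game; each strategy sees only the opponent's past moves."""
    hist_a, hist_b = [], []  # opponent moves as seen by each player
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # -> (9, 14)
print(play(tit_for_tat, tit_for_tat))    # -> (30, 30)
```

[Everything interesting here is hard-coded: two moves, a fixed payoff matrix, strategies that are lookup rules. That narrowness is exactly the point of the reply -- nothing in this style of model scales up to open-ended consequentialist reasoning, alliance formation, or persuasion.]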
--
Eliezer S. Yudkowsky
http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT