From: Martin Striz (email@example.com)
Date: Fri Jun 18 2004 - 08:06:25 MDT
Marc Geddes wrote:
Why not just write some kind of happiness-maximization algorithm?
Obviously, murderers tend to make lots of other people unhappy, so
their desires would be ignored. Similarly, the deleterious aspects of
religions, those which cause terrorism, wars, and strife, would be
ignored, while those aspects that promote humanitarianism would be
included. I think what humanity really wants as a proximal goal is
happiness (as a distal goal, what humanity wants is to maximize
reproductive fitness, but our moral sense is usually blind to that goal
-- also, that goal may become obsolete).
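As a purely illustrative toy sketch of the idea above (all names and the scoring scheme are my own assumptions, not anything Striz or Geddes specified): give each desire a happiness payoff for its holder and an effect on everyone else, filter out desires whose net effect on others is negative (the murderer case), and sum what remains.

```python
# Toy happiness-maximization filter. Illustrative only: the data
# layout and thresholds are assumptions, not a real proposal.

def admissible_desires(desires):
    """Keep only desires whose net effect on *others* is non-negative."""
    return [d for d in desires
            if sum(d["effects_on_others"]) >= 0]

def total_happiness(desires):
    """Aggregate happiness over the admissible desires."""
    kept = admissible_desires(desires)
    return sum(d["own_happiness"] + sum(d["effects_on_others"])
               for d in kept)

# A murderer's desire hurts others and is filtered out; a
# humanitarian desire counts toward the total.
desires = [
    {"name": "murder", "own_happiness": 5, "effects_on_others": [-100, -100]},
    {"name": "charity", "own_happiness": 3, "effects_on_others": [2, 2]},
]
```

Even this toy version shows where the hard problems live: how to measure "happiness", and how to aggregate it across people, are exactly the points the post leaves open.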
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT