From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu May 29 2003 - 19:02:04 MDT
Ben Goertzel wrote:
> Ok, I concede -- if that is really Bill's definition of happiness, then of
> course a superintelligent AI that is rigorously goal-driven and is given
> this as a goal will create something like euphoride (cf "The Humanoids") or
> millions of micromachined mannequins.
>
> Detailed specification of a richer definition of "human happiness", as
> hinted at in The Humanoids and its sequel novels, is an interesting and
> nontrivial problem...
The richer definition is, I think, volition: maximize self-determination,
minimize unexpected regret.
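
A crude way to write that down as an objective, purely as an illustration
(the terms SelfDet and UnexpectedRegret, and the tradeoff weight lambda,
are placeholders here, not a worked-out proposal):

    \max_{\pi} \; \mathbb{E}_{\pi}[\,\mathrm{SelfDet}(h)\,]
        \;-\; \lambda \, \mathbb{E}_{\pi}[\,\mathrm{UnexpectedRegret}(h)\,]

where \pi is the AI's policy, h the history that policy produces, and
\lambda > 0 sets the exchange rate between the two terms. The hard part,
of course, is specifying the two functionals, not writing the maximization.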
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence