Re: Morality simulator

From: Peter de Blanc (peter@spaceandgames.com)
Date: Fri Nov 23 2007 - 00:57:09 MST


On Fri, 2007-11-23 at 14:41 +0800, Stefan Pernar wrote:
> What you 'value' is your 'utility function'. Let me explain:
>
> * Axiom: To exist is preferable to not existing
> * Utility: Ensure existence => make sure you do not die
> * Assumption: To exist is easier with the support of others
> * Assumption: Others share my axiom
> * Assumption: Others understand my assumptions
> * Assumption: The cost of deceit is higher than the utility of
> manipulation
> * Utility: Explain assumptions to others => communicate your thoughts
> * Utility: Be honest => do not hide anything
> * Assumption: Some neither share my axiom nor my assumptions
> * Utility: Respect all others as much as yourself and let them be =>
> know others but stay true to yourself
> * Assumption: The more who share this knowledge, the better for
> everyone
> * Utility: Help others to understand your reasons for choosing your
> axioms, and share the experiments from which you get your evidence in
> order to reinforce the positive feedback loop => be the change
> * Assumption: Nobody is perfect
> * Utility: Be forgiving and understanding
> * Utility: Strive for perfection through self-improvement in becoming
> a mesoist
> * Assumption: Sometimes things go wrong despite all good will
> * Utility: Know and manage your existential risks and opportunities
> * Assumption: There is a limit for humans beyond which
> self-improvement is not worth the effort
> * Utility: Create post human level rationality

You're confusing utility with expected utility. See Terminal Values and
Instrumental Values
(http://www.overcomingbias.com/2007/11/terminal-values.html).
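A minimal sketch of the distinction, using a made-up outcome space,
action set, and probabilities: a utility function assigns value to
outcomes themselves (terminal values), while expected utility scores an
action by weighting those outcome utilities by the action's outcome
probabilities. Items like "be honest" only earn their keep through that
expectation (instrumental values).

# Toy example only: made-up outcomes, actions, and probabilities.

# Terminal values: a utility function over outcomes themselves.
utility = {
    "alive_with_allies": 10.0,
    "alive_alone": 5.0,
    "dead": 0.0,
}

# A world model: for each action, a probability distribution over
# outcomes. "Be honest" etc. only matter through these numbers.
world_model = {
    "be_honest": {"alive_with_allies": 0.7, "alive_alone": 0.25, "dead": 0.05},
    "deceive":   {"alive_with_allies": 0.2, "alive_alone": 0.6,  "dead": 0.2},
}

def expected_utility(action):
    # Probability-weighted sum of the utilities of the action's outcomes.
    return sum(p * utility[outcome]
               for outcome, p in world_model[action].items())

for action in world_model:
    print(action, expected_utility(action))

On these made-up numbers honesty comes out ahead (8.25 vs 5.0), but
that is a fact about the world model, not a value in its own right;
change the probabilities and the ranking flips, which is exactly the
sense in which "be honest" is instrumental rather than terminal.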

 - Peter de Blanc


