From: Rolf Nelson (rolf.h.d.nelson@gmail.com)
Date: Fri Apr 11 2008 - 19:00:58 MDT
On Thu, Apr 10, 2008 at 10:23 AM, Tim Freeman <tim@fungible.com> wrote:
> That sort of altruism is exploitable even without considering absurdly
> improbable hells. All I need to do to exploit you is breed or
> construct or train a bunch of humans who want exactly what I want.
In Causal Decision Theory, this is true. However, it's possible to
adopt a different decision theory that is specifically designed to be
less exploitable, without having to give up the things that you care
about.
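To make the distinction concrete, here is a toy sketch in Python (purely
illustrative; the population sizes, the "manufactured" flag, and the
discounting rule are assumptions made up for the example, not a worked-out
decision theory):

# Toy model of the exploit Tim describes and of a commitment-based
# ("policy level") response. All names and numbers are illustrative.

def altruist_choice(population, ignore_manufactured):
    """Pick the action favored by the most people the altruist counts."""
    votes = {}
    for person in population:
        if ignore_manufactured and person["manufactured"]:
            continue  # the committed agent discounts bred-to-order preferences
        votes[person["wants"]] = votes.get(person["wants"], 0) + 1
    return max(votes, key=votes.get)

def exploiter_outcome(ignore_manufactured):
    """The exploiter breeds extra agents only if doing so sways the altruist."""
    natural = [{"wants": "A", "manufactured": False} for _ in range(10)]
    bred = [{"wants": "B", "manufactured": True} for _ in range(100)]

    without = altruist_choice(natural, ignore_manufactured)
    with_bred = altruist_choice(natural + bred, ignore_manufactured)

    breeding_pays = (with_bred == "B" and without != "B")
    return with_bred if breeding_pays else without

print(exploiter_outcome(ignore_manufactured=False))  # "B": act-by-act altruist is exploited
print(exploiter_outcome(ignore_manufactured=True))   # "A": commitment removes the incentive

The particular discounting rule isn't the point; the point is that an agent
who chooses at the policy level can remove the adversary's incentive to
create the extra people in the first place.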
> You need to fix the set of people you care about, rather than allow it
> to be manipulated by an adversary. You can't afford to give others
> the power to produce entities that you care about.
1. Please suggest an alternative morality that you would actually
endorse, and which is completely unexploitable.
2. Is your overall goal to make the world a better place, or have you
elevated 'be unexploitable' to a supergoal rather than a means to an
end?