From: Matt Mahoney (firstname.lastname@example.org)
Date: Mon Mar 09 2009 - 08:16:26 MDT
--- On Sun, 3/8/09, Vladimir Nesov <email@example.com> wrote:
> What if you create a simulation in which you torture and
> murder 10^100 people? Does it become OK if you erase all the evidence?
That depends on (1) what your ethical model counts as a simulation, (2) what it says about simulated murder and torture, and (3) what it says about undoing your actions.
Suppose I claim that running autobliss ( http://www.mattmahoney.net/autobliss.txt ) with two negative arguments (so the agent receives negative reinforcement regardless of its actions) is 10^-20 as evil as torturing and murdering a human (or pick any number > 0). Then running 10^120 copies would be as evil as torturing and murdering 10^100 people. But I can write an equivalent, more efficient program that produces the same output for the same input and run it on my laptop: instead of reporting 1000 units of pain and one death per copy, it reports 10^123 units of pain and 10^120 deaths in one pass.
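The "equivalent but more efficient program" argument can be made concrete with a short sketch. This is a hypothetical illustration, not Mahoney's actual autobliss code: it assumes each copy would report 1000 units of pain and one death, and observes that since the aggregate output depends only on the totals, the 10^120-fold simulation collapses to two multiplications.

```python
# Hypothetical sketch of the "equivalent but more efficient" program.
# Assumption (not from autobliss itself): each simulated copy reports
# 1000 units of pain and then one death.
COPIES = 10**120
UNITS_PER_COPY = 1000

# Naive version (intractable): run every copy.
#   for _ in range(COPIES):
#       simulate_one_agent()   # hypothetical per-copy simulation

# Equivalent version: the report depends only on the totals, so
# compute them directly and emit the same output in O(1).
total_pain = COPIES * UNITS_PER_COPY   # 10**123 units
total_deaths = COPIES                  # 10**120 deaths
print(f"felt {total_pain} units of pain; {total_deaths} deaths")
```

Both versions are the same function of their input in the Turing-machine sense, which is exactly what makes the ethical question below hard to pin down.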
Is that unethical? If not, then define which Turing machines count as a simulation of torture and which don't.
-- Matt Mahoney, firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:01:03 MDT