Re: [sl4] Irrational motivations (was [sl4] A Thought Experiment: Thou Shalt Not Bear False Witness)

From: Frank Adamek (f.adamek@yahoo.com)
Date: Wed Dec 02 2009 - 19:08:21 MST


Unless of course my volition is not to be uploaded in certain ways. But more critically, your argument justifying the AI's behavior equally justifies a few other things. With a slight alteration: "By killing both of you instantly and destroying your information, neither copy of you experiences any pain or fear, so your volition is satisfied, assuming ve was also programmed to satisfy your volition."
-Frank Adamek

--- On Wed, 12/2/09, Matt Mahoney <matmahoney@yahoo.com> wrote:
What is irrational or not is irrelevant to my argument. The AGI has been programmed to be truthful to humans. Therefore ve must upgrade your intelligence to ver level. Ve has a model of your mind, just like you have a model of your dog's mind that enables you to predict its actions. That model will maintain your evolved fear of death, because altering your memories, goals, or behavior would be dishonest. By killing you instantly, neither copy of you experiences any pain or fear, so your volition is satisfied, assuming ve was also programmed to satisfy your volition.

Need I point out that giving Jupiter brains simple top-level goals, like honesty, making people happy, or making paper clips, can have unintended consequences?

 -- Matt Mahoney, matmahoney@yahoo.com


