From: Frank Adamek (firstname.lastname@example.org)
Date: Wed Dec 02 2009 - 19:08:21 MST
Unless of course my volition is to not be uploaded in certain ways. But more critically, the argument justifying the AI's behavior equally justifies a few other things. With a slight alteration: "By killing both of you instantly and destroying your information, neither copy of you experiences any pain or fear, so your volition is satisfied, assuming ve was also programmed to satisfy your volition."
--- On Wed, 12/2/09, Matt Mahoney <email@example.com> wrote:
What is irrational or not is irrelevant to my argument. The AGI has been programmed to be truthful to humans. Therefore ve must upgrade your intelligence to ver level. Ve has a model of your mind, just like you have a model of your dog's mind that enables you to predict its actions. That model will maintain your evolved fear of death, because altering your memories, goals, or behavior would be dishonest. By killing you instantly, neither copy of you experiences any pain or fear, so your volition is satisfied, assuming ve was also programmed to satisfy your volition.
Need I point out that giving simple top level goals to Jupiter brains like honesty, or making people happy, or making paper clips can have unintended consequences?
-- Matt Mahoney, firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Thu May 23 2013 - 04:01:42 MDT