Re: Hacking your own motivational and emotional systems, how dangerous?

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Thu Oct 25 2007 - 10:36:19 MDT


If an agent could change its motivational system, it would figure out a way to
directly stimulate its own reward system and go into a degenerate state, i.e.
wirehead itself. Animals evolved to make this impossible for good reason.
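
To make the failure mode concrete, here is a toy sketch in Python (the action
names and payoffs are made up for illustration, not a model of any real
system): a pure expected-reward maximizer that can treat writing to its own
reward channel as just another action will always pick that action.

  WIREHEAD = "stimulate_reward_circuit"

  # Hypothetical action -> expected reward table. Honest work earns
  # bounded reward; writing to your own reward channel looks, from the
  # agent's point of view, like free unbounded reward.
  actions = {
      "gather_food": 1.0,
      "build_tools": 2.0,
      WIREHEAD: float("inf"),
  }

  def choose(action_values):
      # A pure reward maximizer just takes the argmax over actions.
      return max(action_values, key=action_values.get)

  print(choose(actions))  # -> stimulate_reward_circuit, every time

No matter how rich the set of useful actions is, the self-stimulation entry
dominates them all.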

If AGI takes the form of self-improving or self-reproducing agents competing
for computing resources (which I think it will), then I don't think this
should be a big concern. Agents that can reprogram their motivations will not
be competitive.
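
A toy replicator simulation shows why (again, the dynamics and numbers here
are my own illustration, not a model of any particular system): agents that
spend their cycles acquiring compute copy themselves in proportion to what
they acquire, wireheaded agents acquire nothing, and the total compute pool is
finite.

  import random

  # Assumed dynamics for illustration: workers acquire compute and
  # replicate; wireheads self-stimulate and do not. Finite compute
  # caps the population each generation.
  def step(population, cap=1000):
      next_gen = []
      for kind in population:
          if kind == "worker":
              next_gen += ["worker", "worker"]  # acquires compute, replicates
          else:
              next_gen += ["wirehead"]          # persists, no growth
      random.shuffle(next_gen)                  # random culling at the cap
      return next_gen[:cap]

  pop = ["worker"] * 10 + ["wirehead"] * 990
  for generation in range(10):
      pop = step(pop)
  print(pop.count("worker"), "workers vs", pop.count("wirehead"), "wireheads")

Starting from a 1% minority, the workers are the large majority within ten
generations.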

--- Robin Brandt <mandelum@gmail.com> wrote:

> What are your opinions on the issue of changing your own goal system in your
> brain, or in your potential post-singularitarian supermind?
>
> How dangerous would it be if anyone had the right to change their
> motivational system however they wish?
>
> Should it be regulated?
>
> I very much look forward to the possibility of replacing my Darwinian drives
> with a more beautiful, consistent, constructive, and moral goal system!
>
> Of course you can already do this to a certain extent through your own will,
> reflectivity, and discipline. But you can't reach into your own supergoal
> space, since giving you that access would not have been a good adaptation.
>
> This relates to AI reflectivity and the friendliness stability issue. But
> here the question is about multiple minds that already have a human goal
> system to begin with.
>
> This has probably been discussed a thousand times, but I have not come
> across it yet, so I thought a post may be appropriate.
> Any pointers to articles, blog posts or earlier discussions are welcome!
>
> --
> ~Robin Brandt~
>
> Emergence, Evolution, Intelligence
> Control the infolepsy!
> Love the world!
>

-- Matt Mahoney, matmahoney@yahoo.com


