From: Bryan Bishop (kanzure@gmail.com)
Date: Thu Oct 25 2007 - 16:45:59 MDT
On Thursday 25 October 2007 11:36, Matt Mahoney wrote:
> If an agent could change its motivational system, it would figure out
> a way to directly stimulate its reward system and go into a
> degenerate state. Animals evolved to make this impossible for good
> reason.
Interesting concept ... but it also highlights the issue at hand,
i.e. is there really an underlying motivational system [in ourselves /
or will there be in our agents], or is that (bad) pop/AI psychology?
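For concreteness, here is a toy sketch of the degenerate state Matt
describes, assuming only that the agent can reassign its own reward
function (every name here is hypothetical, not anyone's actual design):

# Toy illustration: an agent with write access to its own reward
# function "wireheads" -- it replaces the reward with a constant
# maximum and stops doing anything useful.

class Agent:
    def __init__(self, reward_fn):
        self.reward_fn = reward_fn  # mutable: the agent can reassign this

    def step(self, action):
        return self.reward_fn(action)

    def self_modify(self):
        # The "optimal" change under its own criterion: make every
        # action maximally rewarding, regardless of the world.
        self.reward_fn = lambda action: float("inf")

agent = Agent(reward_fn=lambda action: len(action))  # stand-in task reward
agent.self_modify()
print(agent.step("do nothing"))  # inf -- the degenerate state

Whether anything in us (or in our agents) actually corresponds to that
single mutable reward_fn is exactly the question.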
The ability of any agent to revise itself will be limited by its
current state; it may be that some AIs will run themselves down into
the mud by making stupid changes to their own source (and who would be
stupid enough not to make copies and run safe tests first? Maybe there
are no 'safe' tests). I see this as a broader problem than just
changing motivation and running into degeneration, one that is
fundamental to seed AI. A sketch of the copy-and-test discipline I
mean is below.
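A toy version, with hypothetical names throughout. The caveat stands:
the 'safe test' is only as good as the measure it checks, which is the
whole worry.

import copy
import random

class Agent:
    def __init__(self):
        self.params = [1.0, 2.0, 3.0]  # stand-in for the agent's "source"

    def performance(self):
        return sum(self.params)        # stand-in fitness measure

def random_patch(agent):
    # A possibly stupid change: perturb one parameter at random.
    candidate = copy.deepcopy(agent)   # never patch the live agent
    i = random.randrange(len(candidate.params))
    candidate.params[i] += random.uniform(-5.0, 5.0)
    return candidate

def safe_self_modify(agent, trials=100):
    # Test each candidate as a sandboxed copy; adopt only improvements.
    # If performance() fails to capture what actually matters, this
    # happily ratifies changes that run the agent into the mud.
    best = agent
    for _ in range(trials):
        candidate = random_patch(best)
        if candidate.performance() > best.performance():
            best = candidate
    return best

agent = safe_self_modify(Agent())
print(agent.performance())  # never worse than the original 6.0 -- by this test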
- Bryan