From: Matt Mahoney (matmahoney@yahoo.com)
Date: Thu Sep 25 2008 - 08:35:54 MDT
--- On Thu, 9/25/08, Stuart Armstrong <dragondreaming@googlemail.com> wrote:
> > Self improvement requires a test or goal that cannot
> be altered through generations. Assuming that goal is
> "intelligence", we are not smart enough to test
> for it above our own level. If we are, then perhaps someone
> could describe that test.
>
> I've already proposed a gaggle of tests - mainly taking an open-ended
> task (running a successful company, organising an election campaign,
> etc.) with a clear relative standard of success, and setting the AIs
> head to head. A successful test just means a better understanding of
> what we really want.
If winning an election is good, then becoming supreme dictator of the earth is better. If running a successful company is good, then acquiring all of the world's wealth and starving the rest of the population is better.
Unfortunately, I can't come up with an alternative to competition among agents.
-- Matt Mahoney, matmahoney@yahoo.com