From: Michael Wilson (mwdestinystar@yahoo.co.uk)
Date: Thu Jul 14 2005 - 21:17:50 MDT
Ben Goertzel wrote:
> How about
>
> Goal C: Migrate to the Andromeda galaxy and use all the
> mass-energy there to advance mathematics, science and technology
> as far as possible; but leave the Milky Way galaxy alone, ensuring
> that it evolves over time very similarly to what would happen if
> no superhuman AI ever existed.
>
> This goal also doesn't mention humanity explicitly, yet seems much
> less dangerous than goal A.
This is simply a scaled-up version of the realistic goal system
'do commercially useful AI task X without optimising any part of
reality outside of your own hardware, excepting a neutral
presentation of the results'. Unfortunately, although it sounds
modest, even this seems rather hard to formalise as a verifiably
safe utility function; in particular, it is difficult to
distinguish between harmless and harmful side effects, to rule
out unintended secondary optimisation effects, and to pin down
what counts as a 'neutral presentation of the results'. Still,
this looks like a relatively tractable problem to me, though
Eliezer usually disagrees, and I must admit he has more
experience of the goal system design problem domain than I do.
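
To make the difficulty concrete, here is a toy sketch in Python
(every name and structure in it is an illustrative assumption of
mine, not a worked design): score the task, then heavily penalise
any drift of features outside the system's own hardware away from
a 'no-AI' baseline.

    # Toy model: world states as feature dicts, with a designated
    # set of 'internal' features the AI may change freely. All of
    # this is assumed structure for illustration only.
    INTERNAL = {"cpu_temp", "memory_used", "results_buffer"}

    def external_divergence(world, baseline):
        """L1 distance from the baseline over external features."""
        keys = (set(world) | set(baseline)) - INTERNAL
        return sum(abs(world.get(k, 0.0) - baseline.get(k, 0.0))
                   for k in keys)

    def utility(world, baseline, task_score, penalty=1e6):
        """Task reward minus a heavy penalty for side effects."""
        return task_score - penalty * external_divergence(world,
                                                          baseline)

    # Example: a tiny external change swamps the task reward.
    baseline = {"cpu_temp": 40.0, "factory_output": 100.0}
    world = {"cpu_temp": 95.0, "factory_output": 100.2}
    print(utility(world, baseline, task_score=10.0))  # -199990.0

Of course the real problem is hidden in the choice of feature set
and baseline: anything the measure omits remains an open channel
for unintended optimisation, and the 'results_buffer' here stands
in for exactly the 'neutral presentation' clause that is so hard
to pin down.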
* Michael Wilson