From: Mikko Särelä (msarela@cc.hut.fi)
Date: Thu Aug 18 2005 - 00:49:07 MDT
On Wed, 17 Aug 2005, Richard Loosemore wrote:
>
> This hypothetical paperclip monster is being used in ways that are
> incoherent, which interferes with the clarity of our arguments.
>
> Hypothesis: There is a GAI that is obsessed with turning the universe into
> paperclips, to the exclusion of all other goals.
I believe the problem here is mostly one of perception. Perhaps one should
not even use the word AI, because it gives people an entirely wrong idea of
what one is talking about. Instead, one could speak of an optimization
process which is tuned in such a way as to become better and better at
creating paper clips.
That process would learn about its environment and optimize its internal
workings in order to become better at creating paper clips. That is,
provided the optimization process was built correctly and none of the
changes it made to its own decision algorithms altered the system's
goal-directed behavior.
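To make that picture concrete, here is a rough, purely illustrative Python
sketch of such a process (the function names and numbers are invented for
the example): a loop that proposes changes to its own decision parameters
and keeps a change only if it scores better against a fixed paperclip
objective, which is never itself revised.

import random

def paperclips_produced(policy, environment):
    # Hypothetical stand-in for evaluating how many paperclips a given
    # decision policy yields in a given environment.
    return sum(policy[s] * environment[s] for s in environment)

def propose_variant(policy):
    # Randomly perturb one decision parameter of the current policy.
    variant = dict(policy)
    key = random.choice(list(variant))
    variant[key] += random.uniform(-0.1, 0.1)
    return variant

def optimize(policy, environment, steps=1000):
    # The objective (paperclips_produced) is fixed; only the policy
    # changes, and only changes that improve the score are kept.
    best = paperclips_produced(policy, environment)
    for _ in range(steps):
        candidate = propose_variant(policy)
        score = paperclips_produced(candidate, environment)
        if score > best:
            policy, best = candidate, score
    return policy

environment = {"wire": 1.0, "steel": 0.5, "labour": 0.2}
policy = {"wire": 0.0, "steel": 0.0, "labour": 0.0}
policy = optimize(policy, environment)

Nothing in that loop "wants" anything; it simply keeps whatever makes the
fixed score go up, which is all the paperclip scenario requires.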
It does not necessarily have a mind, mind you. Thus, it is not prudent to
talk about it conceiving things, or being too busy to stop and think, etc.
It just isn't part of its nature to do that.
--
Mikko Särelä
  "I find that good security people are D&D players" - Bruce Schneier