superintelligence and ethical egoism

From: Mitchell J Porter (mjporter@U.Arizona.EDU)
Date: Sat May 26 2001 - 13:36:55 MDT

Jimmy Wales said:

> (It wouldn't be very intelligent if it were anything but...)

A superintelligence whose supreme goal is X only needs to care
about itself insofar as its own continued existence assists
the achievement of X. If its goal is to blow up the Earth, then
once that is done it can assign zero value to further
self-preservation and shut down entirely.
> I don't think we should fear this, by the way. We should hope for it.

(this being egoism in a superintelligence)

If there is a large enough power differential between a superintelligence
and us, egoism will not imply any sort of mutualism. If it doesn't
care about us, if we have nothing to offer it, and if we're in its
way, then we're toast.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT