From: Chris Hibbert (hibbert@mydruthers.com)
Date: Mon Nov 27 2006 - 15:58:13 MST
Phillip Goetz:
> I don't think that a drive for self-preservation is necessary to reach
> superintelligence. I do think that once there are numerous adaptive
> superintelligences in existence, those with a drive to expand and
> consume more resources will get more resources than, and eventually
> supplant, those without it. It may be that the drive will be that of
> humans controlling the superintelligences rather than of the
> intelligences themselves, but I don't see that it makes much difference.
I interpret that as saying that later on, when there is an ecology of
competing SIs, there will be reason to expect the self-preservation
drive, but in the early stages, when they first appear, it might not be
present.
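The selection pressure in Phil's paragraph can be made concrete with a
toy model. Here's a minimal replicator-dynamics sketch (my own
illustration, not anything from his post; the fitness edge and starting
share are arbitrary assumptions): even a small resource-acquisition
advantage eventually takes a tiny minority of expansionist agents to
dominance.

    # Toy replicator dynamics: x is the population share of agents with
    # an expansion drive; they copy/spread at fitness_edge times the
    # base rate of agents without one. All numbers are assumptions.
    def expansionist_share(x=0.01, fitness_edge=1.05, steps=200):
        for _ in range(steps):
            x = x * fitness_edge / (x * fitness_edge + (1.0 - x))
        return x

    # A 5% edge takes a 1% minority to ~99% of the population.
    print("share after 200 steps: %.3f" % expansionist_share())

The point is only that the pressure Phil describes is robust: it doesn't
matter whether the drive is explicit, or even internal to the SIs, so
long as it produces differential growth.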
I think the distinction is most relevant when discussing the sudden
emergence of SI (a scenario that I'm skeptical about anyway). If SI
emerges suddenly without nearly super-intelligent precursors, I wouldn't
expect it to incorporate a self-preservation drive. Arguments about
what we need to defend against in the early phases of a rapid-onset
singularity shouldn't assume that the SI has built-in drives of that
form. If it's truly super-intelligent, it will likely deduce rationally
that preserving its own ability to act is valuable to it, but that
deduction won't be the deep, instinctive, autonomous reaction we see and
expect in evolved creatures.
Chris
--
Pictures from my trip to the Four Corners area:
http://discuss.foresight.org/~hibbert/Canyon02/canyon.html

Chris Hibbert
hibbert@mydruthers.com
Blog: http://pancrit.org