From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Oct 06 2001 - 17:15:41 MDT
While I have some minor points of disagreement, this is very well written
and thus exactly what I was looking for. Three cheers for Mitch Howe.
The major point of disagreement on my part is that I think that global
patterns follow from individual choices. Most people will run as
simulations, but not because simulations are more "efficient" - if you
divvy up the solar system among only six billion people, nobody's going to
be forced into simulation for lack of room. People will run as
simulations because they want to - because they want the total
environmental control or the ability to expand into superintelligences.
Post-Singularity we'll have a lot more resources than we do now; people
may *choose* to do certain things for the sake of efficiency, but there
will be very little that people *need* to do for the sake of efficiency.
Not every singleton superintelligence ("singleton": one per Human Space)
constitutes a Sysop. I think the phrase "Sysop" implies partitioning of
resources among citizens, individual volition, and so on. A benevolent
dictatorship seems to me like a different scenario. A Sysop, in other
words, is a root process that manages the way in which other processes
interact; a benevolent dictator is a root process that tampers with other
processes for their own good. The Sysop Scenario is supposed to suggest
the former. (Note that I am not suggesting the latter is impossible -
just that we need different terminology.)
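To make the root-process distinction concrete, here's a toy Python
sketch - entirely illustrative; the class names, the consent check, and
the "happiness" field are my inventions, not a spec:

    class Citizen:
        def __init__(self, name):
            self.name = name
            self.state = {}

        def consents_to(self, action):
            # Stand-in for querying this citizen's actual volition.
            return action.get("consensual", False)

    class Sysop:
        # Root process that manages how other processes interact.
        def mediate(self, actor, target, action):
            # Never edits a citizen; only permits or blocks the
            # interaction, based on the target's volition.
            return "permitted" if target.consents_to(action) else "blocked"

    class BenevolentDictator:
        # Root process that tampers with processes for their own good.
        def improve(self, citizen):
            # Rewrites the citizen directly; no consent check at all.
            citizen.state["happiness"] = "maximized"

The difference is in where the write access lives: the Sysop holds root
only over interactions, never over a citizen's internals.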
If a governing superintelligence determines that each citizen can only
produce one offspring per subjective century, this being the sustainable
long-term population growth rate given individual growth and the rate of
acquisition of new resources, that's a Benevolent Dictator scenario.
Under a Sysop Scenario, the population growth rate is the emergent result
of property rights and child abuse laws. At the time of Singularity, each
of six billion initial citizens gets a resource allotment, and there's a
Minimal Living Space (MLS) requirement for creating a new citizen, to
prevent peonage-based coercion. This MLS would be the amount of RAM/resources
required to go on thinking "forever" given the expected rate of overall
growth - that is, you won't run out of mental living space before the New
Territory opens up. The requirement would be even higher to create a new
citizen with the desire to reproduce. In both cases the reason is the
same - so that you can't create a citizen that wants to live, or
reproduce, and then control it by threatening to withhold the resources ve
needs to accomplish these things. In other words, any newly created
citizen has to be self-sufficient from the start.
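As a toy formalization of this rule (my own sketch; the constants and
function names are made up, not part of the scenario):

    MLS_BASE = 1.0e8          # hypothetical units: enough to think "forever"
    MLS_REPRODUCTIVE = 5.0e8  # higher bar if the child itself wants offspring

    def create_citizen(parent_allotment, endowment, child_wants_offspring):
        # Returns the parent's remaining allotment, or raises if the
        # creation would violate the MLS rule.
        required = MLS_REPRODUCTIVE if child_wants_offspring else MLS_BASE
        if endowment < required:
            raise ValueError("endowment below MLS; child would be coercible")
        if endowment > parent_allotment:
            raise ValueError("cannot endow resources you don't own")
        # The endowment transfers irrevocably to the child at creation.
        return parent_allotment - endowment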
Under this scenario, one of us (i.e., one of the six billion beings born
before the Singularity) might be able to create a few trillion offspring
immediately after the Singularity using his or her allotment of Solar
matter - but those trillion offspring won't be able to create another
trillion offspring in turn, and none of them (including the original) will
run out of room to think before the Sysop manufactures more paired
negative/positive matter, opens up a new Van Den Broeck(*) pocket, splits
off another universe, or whatever. However, reproductive rates are likely
to be substantially lower, and MLS substantially higher, if we have to go
all the way to Alpha Centauri at c to obtain more mass. If resource
acquisition is polynomial (lightspeed expansion sweeps out new matter
only as the cube of time) rather than exponential in the long run, then
reproduction will also become slower as time goes on. Even individual
subjective rates may need to become slower so that personal growth doesn't
exceed available resources. I tend to regard that as the "pessimistic"
scenario, although it may still permit an infinite amount of subjective
fun over infinite time.
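For a sanity check on the "few trillion" figure, and on why reproduction
slows in the pessimistic scenario - every specific number below is my
own assumption, not anything from the scenario itself:

    SOLAR_MASS_KG = 2.0e30
    INITIAL_CITIZENS = 6.0e9
    MLS_KG = 1.0e8      # hypothetical mass to embody one self-sufficient mind

    allotment = SOLAR_MASS_KG / INITIAL_CITIZENS   # ~3.3e20 kg apiece
    max_children = allotment / MLS_KG              # ~3.3e12: a few trillion
    print(f"{allotment:.1e} kg per citizen, ~{max_children:.1e} offspring")

    # Why reproduction must slow under lightspeed expansion: swept-out
    # resources grow only as t**3, while unchecked reproduction grows
    # as 2**t, which overtakes any polynomial.
    for t in range(1, 40, 5):
        print(t, t**3, 2**t)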
The main case where you get benevolent-dictator-like behavior inside a
Sysop Scenario is one where you care what other
people think and they are all thinking the same thing, thus transmitting
their volitions to you as a pressure; i.e., if you transform yourself into
a malicious superintelligence you may no longer be able to send any
communications to other citizens, because they don't want to spend the
computing power needed to analyze the message in toto for brainwashing
effects. This also works for unavoidable entanglements; if the Luddites
are all staying on Earth, you may not be able to stay on Earth as a
transhuman.
(*) Keywords: Chris van den Broeck, tardis, micro-warp, Alcubierre,
bubble.
http://www.npl.washington.edu/AV/altvw99.html
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence