Re: Ben's "Extropian Creed"

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Nov 13 2000 - 22:24:31 MST


Ben Goertzel wrote:
>
> The analogy
>
> AI: humans
> humans: animals
>
> is flawed because humans and animals compete for resources far more than
> AI and humans will have to, due to our different natures

Why do you assume that humans and AIs will need to compete at all? I think
the happiest scenario is the one where everyone runs on protected memory/mass
partitions, with the "laws of physics" enforced by a Sysop that takes up a
minor amount of overhead and isn't interested in competing for space beyond
that.
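
By loose analogy with operating-system memory protection, the core
invariant is something like the following sketch (the class and method
names are mine and purely illustrative, not a design for an actual Sysop):

    class Sysop:
        def __init__(self):
            self.partitions = {}  # citizen -> set of owned resource ids

        def grant(self, citizen, resource):
            # A resource belongs to at most one partition.
            if any(resource in held for held in self.partitions.values()):
                raise PermissionError("resource already owned")
            self.partitions.setdefault(citizen, set()).add(resource)

        def act(self, citizen, resource):
            # The one inviolable "law of physics": a Citizen may operate
            # only on its own partition, however intelligent it is.
            if resource not in self.partitions.get(citizen, set()):
                raise PermissionError("outside your partition")

Every action is mediated by act(), so greater intelligence buys a Citizen
no reach whatsoever into anyone else's partition.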

Ben Goertzel wrote:
>
> It seems to me that in any resource-limited environment populated by
> autonomous agents -- before or after the Singularity -- it will be true
> that greater power/intelligence/ability leads to social dominance....
>
> You disagree with this?

Yes, I do. In the Sysop Scenario, a "static" uploaded human can "walk" up to
the most terrifyingly intelligent entity the world has ever produced and blow
a great big raspberry without incurring the most infinitesimal risk of dying
or even being hurt.

At most, there might be some kind of status hierarchy among the static
uploaded humans, so that people manage to spend all their time obsessing
over how to score points, taking revenge, and dreading the possibility of
embarrassment... as long as we're mere humans, we'll never run out of ways
to make ourselves unhappy. But I expect that the transhumans, including my
future self, will have better things to do.

> Of course, humans have particular biological quirks as regards social
> dominance, but the desire on the part of each agent for more resources is
> pretty much an inevitable consequence of evolution... which leads at least
> to a kind of abstract social dominance hierarchy...

It's quite possible that the Citizens will be capable of trading
computational resources back and forth, but hopefully some "minimal living
space" requirement will be enforced... possibly as a restriction upon the
trading activities of existing Citizens, but certainly as a precondition
before a Child Citizen is created. (The ethical rationale for restricting
trading is that selling off one's minimal living space would be an
irrevocable harm to future versions of the Citizen's self, who may have
changed their minds, or may even have changed so much as to be different
entities.)
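
In rule form, the restriction could be as simple as the following sketch
(MINIMUM_SPACE is a placeholder; the actual floor and its units are
exactly what would have to be decided):

    MINIMUM_SPACE = 1  # placeholder value and units

    def approve_trade(seller_balance, amount):
        # Veto any trade that would drop the seller below the floor,
        # on behalf of future versions of the seller's self.
        return seller_balance - amount >= MINIMUM_SPACE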

Note that I would expect a six-billionth of the Solar System's mass
(approximately 10^23 grams) to be a very substantial multiple (many orders
of magnitude) of "minimal living space".
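
The arithmetic, taking the Sun's mass of roughly 2 x 10^33 grams as the
Solar System total (the planets are a rounding error):

    solar_system_mass_g = 2e33   # the Sun is ~99.9% of the total
    population = 6e9             # circa-2000 head count
    print(solar_system_mass_g / population)   # ~3.3e+23 grams apiece

That's consistent with the 10^23-gram order of magnitude above.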

Disclaimer: The Sysop Scenario is a comprehensible picture of the
Singularity, and therefore automatically suspect. The above mass estimate
does not take into account possible dodges such as the manufacture of
negative and positive matter in equal quantities, the Linde Scenario, the
creation of private Universes, faster-than-light travel to other star
systems, reality engineering, ontotechnology, or outright magic.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


