Re: SL4 meets "Pinky and the Brain"

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jul 16 2002 - 12:59:05 MDT


James Higgins wrote:
>
> I don't believe that to be the case. The Sysop scenario you, yourself,
> suggested previously is highly immoral in my opinion. If you are
> correct about Ben then you have both proposed immoral (subject to
> perspective) goals.

James, I never, ever, ever suggested explicitly programming in the Sysop
Scenario. I thought (and still think) that imagining an FAI having to
serve as the OS of a universe provides an extreme way to test your
conception of morality - it's a more stringent test than we apply to the
morals that humans use to interact with each other. But that wasn't
originally the rationale for talking about the Sysop Scenario. The point
of the Sysop Scenario is that it provides a concrete disproof of these
three claims:

1) It is knowably the case that as intelligence and technological
ability increase, the ability of attackers to wreak asymmetrical havoc
on defenders goes on increasing indefinitely. Therefore we should
attempt to halt progress because existential risk is a monotonically
increasing function of progress, and a Singularity would inevitably
destroy the world by giving trillions of individuals the independent
ability to do so. There is no conceivable way to avoid this outcome.
Therefore the Singularity should be avoided or delayed as long as possible.

2) While not actually destroying all intelligent life, a Singularity
will result in a colonization wave of
entities attempting to grab all resources in the universe. The threat
of instant destruction on any lapse of vigilance will force all
posthuman communities to live in fear of one another, and Darwinian
competition will eliminate all entities with goals that we would regard
as sympathetic, leaving no computing power for activities we would
regard as meaningful. There is no conceivable way to avoid this
outcome. Therefore the Singularity should be avoided or delayed as long
as possible.

3) Creating a Singularity will not actually alleviate misery; billions
of posthumans will simulate countless quadrillions of human-level
sentient beings; and while some of them might live in worlds pleasanter
than this one, quadrillions more would live in worlds as bad as this one
or worse. There is no conceivable way to avoid this outcome. Therefore
the Singularity should be avoided or delayed as long as possible.

The Sysop Scenario is not the only possible solution to any of these
three problems. It just demonstrates that at least one humanly
conceivable solution exists.

But apparently rational discussion of the Sysop Scenario just isn't
possible, even on the SL4 mailing list.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

