Singularity Fun Theory (was: Ethical basics)

From: Eliezer S. Yudkowsky
Date: Fri Jan 25 2002 - 18:33:10 MST

    How much fun is there in the universe?
    What is the relation of available fun to intelligence?
    What kind of emotional architecture is necessary to have fun?
    Will eternal life be boring?
    Will we ever run out of fun?

To answer questions like these... requires Singularity Fun Theory.

    Does it require an exponentially greater amount of intelligence
(computation) to create a linear increase in fun?
    Is self-awareness or self-modification incompatible with fun?
    Is (ahem) "the uncontrollability of emotions part of their essential
charm"?
    Is "blissing out" your pleasure center the highest form of existence?
    Is artificial danger (risk) necessary for a transhuman to have fun?
    Do you have to yank out your own antisphexishness routines in order
not to be bored by eternal life? (I.e., modify yourself so that you have
"fun" in spending a thousand years carving table legs, a la "Permutation
City"?)
To put these anxieties to rest... requires Singularity Fun Theory.

Behold! Singularity Fun Theory!

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT