Re: SL4 meets "Pinky and the Brain"

From: Eliezer S. Yudkowsky
Date: Tue Jul 16 2002 - 12:34:32 MDT

James Higgins wrote:
> Eliezer S. Yudkowsky wrote:
>> If that's true, it doesn't change the fact that Ben Goertzel has
>> posted a mathematical definition of how he would either ask an AI
>> to optimize the world according to his goal system C, or else
>> create a population of entities with goal systems N such that the
>> population-level effect would be to optimize C. Now this can be
>> argued as moral or immoral.
> You know, actually, I don't remember ever having seen such a post by
> Ben. For a period of some months I didn't read many of the posts on
> SL4, so I'm guessing this is why. Could you refer me to the
> post/thread in question? I would very much like to read that
> thread...

Oops, I got Ben's variables wrong. Serves me right for working from
memory. Ben's goal system is Y, the population's goal system is X.

Eliezer S. Yudkowsky                
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:40 MDT