RE: SL4 meets "Pinky and the Brain"

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Jul 16 2002 - 17:12:13 MDT


But Eliezer, in the post you cite, I was discussing how to measure the
similarity between different goal systems (in response to a question from
you), not discussing any particular plans for how to build an AGI or
engineer the future.

I was actually not proposing to create an AGI that would explicitly reshape
the world according to any particular goal system.

Rather, I was discussing, for each possible goal system X, the fuzzy set
w(X) of possible worlds likely to result from creating an AGI with goal
system X.

For instance, if the AGI goal system X is "ignore all other beings, do not
interfere with them at all, and spend your life solving math problems in
solitude", then there is a certain fuzzy set w(X) of possible worlds, which
is similar to the set of possible worlds that would obtain if no AGI were
created at all.

I was stating that two goal systems X and Y should be considered similar if
the possible world sets w(X) and w(Y) they lead to are similar. That's all.
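
For concreteness, here is one way such a comparison could be computed (a
rough sketch only; the post under discussion does not fix a particular
metric): represent each w(X) as a fuzzy membership function over possible
worlds, and score two goal systems by the fuzzy Jaccard overlap of their
world-sets. The worlds, membership values, and the choice of Jaccard
overlap below are all illustrative assumptions, not anything proposed in
the original post.

    # A minimal sketch: each w(X) is a dict mapping possible-world labels
    # to membership degrees in [0, 1].  Similarity of two goal systems is
    # taken here (an illustrative choice) to be the fuzzy Jaccard overlap
    # of the world-sets they lead to.

    def goal_system_similarity(w_x, w_y):
        """Fuzzy Jaccard similarity between two possible-world sets."""
        worlds = set(w_x) | set(w_y)
        overlap = sum(min(w_x.get(w, 0.0), w_y.get(w, 0.0)) for w in worlds)
        union = sum(max(w_x.get(w, 0.0), w_y.get(w, 0.0)) for w in worlds)
        return overlap / union if union > 0 else 1.0

    # Example (hypothetical numbers): a "solve math in solitude" AGI and
    # no AGI at all lead to nearly the same worlds, so similarity is high.
    w_hermit = {"status_quo": 0.9, "mild_change": 0.1}
    w_no_agi = {"status_quo": 1.0}
    print(goal_system_similarity(w_hermit, w_no_agi))  # ~0.82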

-- Ben G

> -----Original Message-----
> From: owner-sl4@sysopmind.com [mailto:owner-sl4@sysopmind.com]On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Tuesday, July 16, 2002 12:35 PM
> To: sl4@sysopmind.com
> Subject: Re: SL4 meets "Pinky and the Brain"
>
>
> James Higgins wrote:
> > Eliezer S. Yudkowsky wrote:
> >>
> >> If that's true, it doesn't change the fact that Ben Goertzel has
> >> posted a mathematical definition of how he would either ask an AI
> >> to optimize the world according to his goal system C, or else
> >> create a population of entities with goal systems N such that the
> >> population-level effect would be to optimize C. Now this can be
> >> argued as moral or immoral.
> >
> > You know, actually, I don't remember ever having seen such a post by
> > Ben. For a period of some months I didn't read many of the posts on
> > SL4 so I'm guessing this is why. Could you refer me to the
> > post/thread in question, I would very much like to read that
> > thread...
>
> Oops, I got Ben's variables wrong. Serves me right for working from
> memory. Ben's goal system is Y, the population's goal system is X.
>
> http://sysopmind.com/archive-sl4/0206/0747.html
>
> --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>


