From: Lucas Sheehan (lucassheehan@gmail.com)
Date: Thu Apr 17 2008 - 16:01:03 MDT
On Thu, Apr 17, 2008 at 12:03 PM, Matt Mahoney <matmahoney@yahoo.com> wrote:
> --- Tim Freeman <tim@fungible.com> wrote:
>
> > From: Matt Mahoney <matmahoney@yahoo.com>
>
> > >We will want it to grant our wishes, to make us happy. So that is
> > >what we will build. But our evolved utility function does not
> > >maximize fitness when we can have everything we want. We will upload
> > >into fantasy worlds with magic genies. We will reprogram our brains
> > >to experience a million permanent orgasms. We will go extinct.
> >
> > Do you want us to go extinct? If not, then the scenario you describe
> > isn't what you want the AI to do. If you do want us to go extinct,
> > then I hope you're a minority.
>
> I think the majority do not want human extinction (even though you would not
> know the difference. Extinction is not death, it is the lack of birth). But
> if enough people believe that AI will result in human extinction (as I do),
> then it is sure to be outlawed.
>
Do you then think we should stop its pursuit? Is your goal to
hinder/avoid/outlaw AI?
Or is this simple trolling? You've made a simple but strong statement
that flies in the face of most of what SL4 stands for.
Or am I misunderstanding?
L.