From: Matt Mahoney (email@example.com)
Date: Thu Apr 17 2008 - 13:03:31 MDT
--- Tim Freeman <firstname.lastname@example.org> wrote:
> From: Matt Mahoney <email@example.com>
> >We will want it to grant our wishes, to make us happy. So that is
> >what we will build. But our evolved utility function does not
> >maximize fitness when we can have everything we want. We will upload
> >into fantasy worlds with magic genies. We will reprogram our brains
> >to experience a million permanent orgasms. We will go extinct.
> Do you want us to go extinct? If not, then the scenario you describe
> isn't what you want the AI to do. If you do want us to go extinct,
> then I hope you're a minority.
I think the majority do not want human extinction (even though you would not
know the difference; extinction is not death, it is the lack of birth). But
if enough people come to believe that AI will result in human extinction (as I
do), then it is sure to be outlawed.
-- Matt Mahoney, firstname.lastname@example.org
This archive was generated by hypermail 2.1.5 : Wed Jun 19 2013 - 04:01:27 MDT