From: Martin Striz (firstname.lastname@example.org)
Date: Tue Jun 06 2006 - 14:11:00 MDT
On 6/6/06, Robin Lee Powell <email@example.com> wrote:
> Again, you are using the word "control" where it simply does not
> apply. No-one is "controlling" my behaviour to cause it to be moral
> and kind; I choose that for myself.
Alas, you are but one evolutionary agent testing the behavior space.
I believe that humans are generally good, but with 6 billion of them,
there's still a lot of crime. Do we plan on building just one AI?
I think the argument is that with runaway recursive self-improvement,
any hardcoded nugget approaches insignificance/obsolescence. Is there
any code you could write for which nobody, no matter how many
trillions of times smarter, could find a workaround?