META: OK, cool down, people

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Mar 23 2001 - 16:34:36 MST


Two comments here.

First, people have now begun complaining about the recent shift in
discussion material, including at least one person who said that, if he
wanted to hear about this, he'd find a list devoted to it.

There is, therefore, too much traffic now devoted to religious matters.
Please make fewer posts. Do not reply unless you have something new and
original to say. Ideally, don't discuss it at all. If this admonition
doesn't produce higher-quality discussion, I'll try a one-week ban. If
that doesn't work, the ban will become permanent.

Second, the fact that YOU have failed to read up on the subject does not
make ME a religious nut. In particular, the standard current working
definition of evil is "Involuntary death, pain, coercion, and stupidity."
(Those who choose may emphasize "involuntary"; personally I'd emphasize
"stupidity".) The question of supergoal content is also discussed, here
and there, in the interim version of "Friendly AI" released to the list.
Please note the following quote:

        Punting the issue of "What is 'good'?" back to individual
        sentients enormously simplifies a lot of moral issues;
        whether life is better than death, for example. Nobody
        should be able to interfere if a sentient chooses life.
        And - in all probability - nobody should be able to
        interfere if a sentient chooses death. So what's left to
        argue about?

The usual rule is that, if you have an opinion, post that opinion
forthrightly. However, to suppose that the entire Board of Directors of
the Singularity Institute plus the list membership of SL4 has never once
thought "Gee, how can I define evil?" borders on the insulting.

If you are ever in doubt as to whether a standard answer exists for
something, then, for crying out loud, WRITE TO ME AND ASK BEFORE YOU POST!

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT