From: Christian L. (firstname.lastname@example.org)
Date: Sun Jan 27 2002 - 13:24:21 MST
>From: "Eliezer S. Yudkowsky" <email@example.com>
>Subject: Re: Ethics and the Public
>Date: Sun, 27 Jan 2002 14:08:42 -0500
>"Christian L." wrote:
> > Not at all, their goal is to stop the Singularity from happening. A good
> > way of achieving that is by killing the people involved in research. That
> > is flawed reasoning.
>Untrue. Vigilante violence affects nonprofits and universities before
>for-profit companies, for-profit companies before military projects, and
>military projects not at all.
>And starting out by
>attacking the Singularity Institute, which is the only AI project on Earth
>that accepts what it really means to be an AI project, the only project
>that cares enough about Friendly AI to put some real work into it, is the
>worst strategy of all.
Yes, I must concede that it would be bad reasoning. I don't think my
phrasing was very good. The discussion was about militant religious
fundamentalists, and what I was trying to convey was that such reasoning
would be "better" than the fundamentalists' own.
META: Sorry about the double posting above. I sent the first message, then
checked my mail an hour later. When it hadn't arrived by then, I assumed
there had been a bug or something, so I sent another.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT