Re: Singularity Institute volunteer meeting this Sunday @ 7 PM EST

From: Yan King Yin (y.k.y@lycos.com)
Date: Fri Mar 26 2004 - 20:33:48 MST


From: Samantha Atkins <samantha@objectent.com>

>> I think we should stop using language such as 'waking
>> people up' to the Singularity. Neither Eliezer nor anyone
>> else here has figured out how to create FAI. I have no
>> problem with FAI per se, but I find it unacceptable
>> that many on this list:
>
>You are welcome to your opinion. Now what? Does the fact that no one
>fully knows how to create FAI mean that we should not seek to acquaint
>people with the Singularity? How so?

If we're all hoping for a 'Friendly' Singularity, then
there is no conflict between us. I was wrong in that I
thought FAI was stupid (although I didn't say so) while
at the same time I also *want* there to be morality. Maybe
we ought to work towards figuring out an objective morality;
that's the most important thing.

>> 1. Claim that FAI is feasible without qualifications
>> and without detailing how it could be done;
>
>What do you mean by "detail"? How much detail? This is a very complex
>project and it is just the beginning. Why would you expect to be able
>to fully detail a project of this ultimate complexity so early?
>Claims of feasibility may not require as much detail as you want in
>order to be reasonable claims.

>> 2. Steadfastly avoid any political discussions that
>> are obviously relevant to establishing a scientific
>> theory of (general) morality.
>
>Hmm. I'm lost. Political discussions are relevant to a scientific
>theory of morality? Please explain.

Politics is obviously relevant, especially if we are to
look at the 'data' in a scientific way.

>> Be honest. There is no evidence that lying to people
>> makes it better for them. Can you cite some examples?
>
>Is there evidence that insulting people, many of whom you very
>obviously do not understand, nor understand their work, works for you?

I have repeatedly stressed that I have no intention of insulting
anyone. People may get a false impression because my
English isn't entirely natural and I also tend to be very
brief. Sorry about that...

The technical problem of designing AGI and the political
problem of dealing with intelligent automation are equally
important. I try to understand both aspects and I'm also
taking time to study AGI designs by A2I2, Novamente, and SIAI.

YKY




This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT