Re: How hard a Singularity?

From: James Higgins (jameshiggins@earthlink.net)
Date: Wed Jun 26 2002 - 14:37:57 MDT


At 03:59 PM 6/26/2002 -0400, Eliezer S. Yudkowsky wrote:
>James Higgins wrote:
>>Actually, I was thinking about this earlier, glad you asked.
>>I think the best solution would be to assemble a board of maybe 10
>>people. These people should be intelligent, but not necessarily
>>geniuses. Some should be experts on AI, but not all. I would say the
>>criteria ALL members must possess would be:
>> 1. A strong desire to see the Singularity occur
>> 2. Strongly value human life and the survival of the human race
>> 3. Must be willing and able to accept that a solution, other
>> than their own, is a better solution
>>The deciding body SHOULD NOT have exactly coinciding interests. They
>>should not, under any circumstances, all be working on the same project
>>(such as the Singularity Institute).
>
>James,
>
>Do you believe that a committee of experts could be assembled to
>successfully build an AI? Or even to successfully judge which new AI
>theories are most likely to succeed?

Do I believe a committee could successfully build an AI? Maybe. But I
don't think it would be a good idea to do it that way.

>If not, why would they be able to do it for Friendly AI?

I never said they could, or should, DESIGN anything. They would simply
approve designs. Zoning committees don't build anything, but they are
important for maintaining order in a metropolitan area. I believe a
Singularity Committee (or whatever it should be called - I'd like to avoid
the term "committee") would be a very useful asset to the human race,
although I can see how it could easily be seen as a detriment to willful,
single-minded solo players or even like-minded teams. Individual
accomplishment is irrelevant in light of the Singularity; successful
completion of the project in the safest manner possible is the only
rational goal.

I believe your goal, Eliezer, is to make the Singularity as friendly and
safe as possible, is it not? If so, you should welcome such a committee as
a way to ensure that the safest and most friendly design is the one
launched. You should under no circumstances fear such a committee since,
if you really are destined to engineer the Singularity, the committee would
certainly concede that your design was the best when it was presented to them.

>Sometimes committees are not very smart. I fear them.

I don't like committees either, and I can understand why you, in
particular, would fear such a committee. It would take away your ability
to single-handedly and permanently alter the fate of the human race, which is
exactly why such a committee would be a good thing. Such decisions are too
big for any one person to make.

If you were on trial for murder and facing the death penalty, would you
want a single person to decide your fate, or a jury?

James Higgins


