Re: How hard a Singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 26 2002 - 13:59:33 MDT


James Higgins wrote:
>
> Actually, I was thinking about this earlier, glad you asked.
>
> I think the best solution would be to assemble a board of maybe 10
> people. These people should be intelligent, but not necessarily
> geniuses. Some should be experts on AI, but not all. I would say the
> criteria ALL members must possess would be:
>
> 1. A strong desire to see the Singularity occur
>
> 2. Strongly value human life and the survival of the human race
>
> 3. Must be willing and able to accept that a solution other
> than their own is the better solution
>
> The deciding body SHOULD NOT have exactly coinciding interests. They
> should not, under any circumstances, all be working on the same project
> (such as the Singularity Institute).

James,

Do you believe that a committee of experts could be assembled to
successfully build an AI? Or even to successfully judge which new AI
theories are most likely to succeed?

If not, why would they be able to do it for Friendly AI?

Sometimes committees are not very smart. I fear them.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT