Re: Activism vs. Futurism

From: Brian Atkins (brian@posthuman.com)
Date: Sun Sep 08 2002 - 17:03:13 MDT


Ben Goertzel wrote:
> hi,
>
> Eliezer says:
>
>>Well, Ben, this is because there are two groups of people who know damn
>>well that SIAI is devoted solely, firstly, and only to the Singularity,
>>and unfortunately you belong to neither.
>
>
> I understand that the *idea* of SIAI is to promote the Singularity
> generically.
>
> However, the *practice* of SIAI, so far, seems very narrowly tied to your
> own perspectives on all Singularity-related issues.
>
> May I ask, what does SIAI plan to do to promote alternatives to your
> approach to AGI? If it gets a lot of funds, will it split the money among
> different AGI projects, or will it put them all into your own AGI project?
>
> What does it plan to do to promote alternatives to your own speculative
> theory on Friendly AI? (And I think all theories on Friendly AI are
> speculative at this point, not just yours.)
>
> When I see the SIAI website posting some of the views on Friendly AI that
> explicitly contradict your own, I'll start to feel more like it's a generic
> Singularity-promoting organization.
>
> When I see the SIAI funding people whose views explicitly contradict your
> own, then I'll really believe SIAI is what you say it is.
>

I know this thread is dead, but I wanted to say in my capacity as a member
of SIAI's board that I do consider us to be "neutral". However, I think there
is some confusion in the quotes above. Our bylaws state our organization's
purpose quite simply, and this does not involve us treating all Singularity
paths equally (unless the board determined that this was the best way to
achieve our purpose). The board so far has decided that the best way to
achieve our purpose is to evaluate all paths, and pick the most promising
one to pursue. I would note that in the past, all three board members have
changed their minds at least once about which path is best, so there is
historical precedent for the idea that some day we may run across something
even more promising and switch to it. All you have to do is
convince at least two of us...

We attempt to be neutral when picking the best path, but we are not required
to fund or promote paths to the Singularity that we consider less ideal,
/unless/ they somehow decrease risks and/or accelerate the Singularity
without negatively affecting our current main plan of action.

So while our organization's purpose is simply stated, pursuing it requires a
complex set of decisions and tradeoffs in order to maximize our progress
towards it. What it all comes down to, of course, is whether we on the board
are capable of making unbiased and rational decisions and tradeoffs... and I
can't prove to you that we are. Which is why this thread is dead.

-- 
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/

