RE: SIAI will self-destruct

From: Ben Goertzel (ben@goertzel.org)
Date: Fri Jun 04 2004 - 18:16:27 MDT


David,

I think you have some valid points about flaws in SIAI's current
approach to fundraising. IMO, to succeed with their current approach,
they'll have to get pretty lucky.

However, my hope is definitely NOT that SIAI self-destructs; my hope is
that both SIAI and Eliezer "grow up" in a relatively graceful and
productive way.

I have some problems with Eliezer's current attitude (not just his shiny
new "bad-boy mad scientist" persona, I mean his current *intellectual*
attitude), and somewhat bigger problems with Michael Wilson's current
attitude.

And I think the "estimated collective volition" approach to Friendliness
is either badly incomplete or badly flawed.

However, I think that if SIAI can attract the funds to hire a few of the
right people, AND if at the same time Eliezer's thinking and personality
mature in the right direction, then SIAI will be able to do some great
stuff. Even if they don't succeed in launching the Singularity, they
could make major contributions to the process.

I'm not going to make any estimates of the odds of SIAI self-destructing,
flourishing and growing, or remaining roughly in a steady-state. The
point of this email is just to say that, in spite of my quite honestly
expressed criticisms, I hope very much for Eliezer & SIAI to grow in a
positive direction.

-- Ben G

>
> With the recent SIAI-bashing on sl4, I might as well jump
> into the fray.
>
> My prediction:
>
> SIAI will self-destruct. It will fail to attract substantial funding.
> It will lose in the race to the Singularity. SIAI will still be working
> on movie website tie-ins, will still be marginally funded by college
> students and the occasional eccentric wealthy individual, and Eliezer
> will still be hammering out his friendliness theory with absolutely no
> working code in the year 2025 - about the time a generic Bayesian
> knowledge network bootstraps itself into superintelligence.
>
> 1. Ben's points about the arrogance of Eliezer and Michael are *very
> much* relevant to the ability of SIAI to meet its goals. There are a
> few orthogonal reasons for this:
>
> 1a. Arrogance tends to close one's mind to the consideration of
> outside ideas. An idea monoculture can and usually does severely limit
> an organization's ability to succeed. Ackoff and other organizational
> systems theorists have demonstrated this extensively.
>
> 1b. Arrogance, even if it is "correctly placed" and "deserved", often
> limits an organization's fundraising capacity. Acting like a complete
> ass is one very effective way of dissuading people from allocating
> resources in the direction of said ass. This effect is *significantly*
> amplified if the principals in question have no external claim to the
> basis of their arrogance (e.g. high honors, doctoral degrees, industry
> success, etc.). Serious investors do not buy into an idea on faith
> alone.
>
> 1c. SIAI's apparent mistrust of "outsiders", as demonstrated in the
> oft-repeated claims that LOGI is obsolete and that there is something
> much better, but it can't be published because evil people will
> implement it before we get friendliness worked out. This position has
> the direct effect of discouraging talented potential seed AI
> programmers from becoming interested in the project. It also has the
> direct effect of discouraging outside investment due to disclosure and
> control concerns. It also has the direct effect (and I believe Ben has
> already noted this) of discouraging other AGI researchers from
> contributing ideas toward the development of SIAI's theory and
> project, e.g. "why bother if what they've published is useless
> anyway?"
>
> 2. Not only is there a complete lack of theoretical cohesiveness
> amongst AGI projects (e.g. Eliezer's and Ben's), but there is a
> complete lack of theoretical, structural, and motivational cohesiveness
> amongst the principals of SIAI itself. I don't think a day goes by
> (recently) during which Eliezer doesn't correct Tyler or Anissimov on
> at least one very critical point. Acknowledging that *correctness* is
> important to maintain, I would argue that when attempting to gain
> public interest or outside funding, this is a critical flaw. Seasoned
> public relations professionals know that if one doesn't know the
> answer, one doesn't make it up. When attempting to secure funding, one
> doesn't present to the public a group of highly disconnected
> individuals.
>
> 3. Awareness Efforts. Admittedly, I was a little excited about the
> flurry of activity that Tyler initiated, as I got the impression that
> SIAI was finally getting serious about promoting itself. This is what
> caused me to begin donating and helping in the first place. After a
> while, however, it became clear to me that it was more of the same:
> undertaking Herculean efforts in order to drive a few more hits to a
> website; bringing on a 20-year-old with no significant network to go
> fundraising in Silicon Valley. Wow. What SIAI seems not to understand
> is that when one undertakes a serious round of fundraising, one must
> approach potential serious investors with either credibility or a
> product - and for reasons noted above, this is something SIAI does not
> have.
>
> --
> o David S. Hansen
> o haploid@haploid.com
>
>
>


