From: Thomas Buckner (tcbevolver@yahoo.com)
Date: Fri Jun 04 2004 - 17:54:02 MDT
--- "David S. Hansen" <haploid@haploid.com> wrote:
> With the recent SIAI-bashing on sl4, I might as well
> jump into the fray.
>
> My prediction:
>
> SIAI will self-destruct. It will fail to attract
> substantial funding.
> It will lose in the race to the Singularity. SIAI
> will still be working
> on movie website tie-ins, will still be marginally
> funded by college
> students and the occasional eccentric wealthy
> individual, and Eliezer
> will still be hammering out his friendliness theory
> with absolutely no
> working code in the year 2025 - about the time a
> generic Bayesian
> knowledge network bootstraps itself into
> superintelligence.
>
> 1. Ben's points about the arrogance of Eliezer and
> Michael are *very
> much* relevant to the ability of SIAI to meet its
> goals. There are a
> couple orthogonal reasons for this:
>
> 1a. Arrogance tends to close one's mind to the
> consideration of outside
> ideas. An idea monoculture can and usually does
> severely limit an
> organization's ability to succeed. Ackoff and other
> organizational
> systems theorists have demonstrated this
> extensively.
>
> 1b. Arrogance, even if it is "correctly-placed" and
> "deserved", often
> limits an organization's fundraising capacity.
> Acting like a complete
> ass is one very effective way of dissuading people
> from allocating
> resources in the direction of said ass. This effect
> is *significantly*
> amplified if the principals in question have no
> external claim to the
> basis of their arrogance (e.g. high honors,
> doctorate degrees, industry
> success, etc.). Serious investors do not buy into
> an idea on faith alone.
>
> 1c. SIAI's apparent mistrust of "outsiders" is
> demonstrated in the
> oft-repeated claim that LOGI is obsolete and there
> is something much
> better, but it can't be published because evil
> people will implement it
> before we get friendliness worked out. This
> position has the direct
> effect of discouraging talented potential seed AI
> programmers from
> becoming interested in the project. It also has the
> direct effect of
> discouraging outside investment due to disclosure
> and control concerns.
> It also has the direct effect (and I believe Ben
> has already noted
> this) of discouraging other AGI researchers from
> contributing ideas
> toward the development of SIAI's theory and project,
> e.g. "why bother if
> what they've published is useless anyway?"
>
> 2. Not only is there a complete lack of theoretical
> cohesiveness
> amongst AGI projects (e.g. Eliezer's and Ben's), but
> there is a complete
> lack of theoretical, structural, and motivational
> cohesiveness amongst
> the principals of SIAI itself. I don't think a day
> goes by (recently)
> during which Eliezer doesn't correct Tyler or
> Anissimov on at least one
> very critical point. While acknowledging that
> *correctness* is
> important to maintain, I would argue that when
> attempting to gain public
> interest or outside funding, this is a critical
> flaw. Seasoned public
> relations professionals know that if one doesn't
> know the answer, one
> doesn't make it up. When attempting to secure
> funding, one doesn't
> present to the public a group of highly disconnected
> individuals.
>
> 3. Awareness Efforts. Admittedly, I was a little
> excited about the
> flurry of activity that Tyler initiated, as I got
> the impression that
> SIAI was finally getting serious about promoting
> itself. This is what
> caused me to begin donating and helping in the first
> place. After a
> while, however, it became clear to me that it was
> more of the same.
> Undertaking herculean efforts in order to drive a
> few more hits to a
> website. Bringing on a 20-year-old with no
> significant network to go
> fundraising in Silicon Valley. Wow. What SIAI
> seems not to understand
> is that when one undertakes a serious round of
> fundraising, one must
> approach potential serious investors with either
> credibility or a
> product - and for reasons noted above, SIAI has
> neither.
>
> --
> o David S. Hansen
Some of your points seem valid to me, David, but I
think you do not understand that this is no normal
investment! If the SIAI effort fails and UFAI results,
it's razor blades all around. If it succeeds, the
investors will not get their money back. They will
live in a world so transformed that economics as they
know it will be without meaning. What will money mean
in a world where you can go out in the yard and say,
"I'd like a Bugatti," and it assembles itself out of
the dirt?
Tom