From: Eliezer Yudkowsky (firstname.lastname@example.org)
Date: Fri Jun 04 2004 - 15:22:21 MDT
David S. Hansen wrote:
> With the recent SIAI-bashing on sl4, I might as well jump into the fray.
> My prediction:
> SIAI will self-destruct. It will fail to attract substantial funding.
> It will lose in the race to the Singularity. SIAI will still be working
> on movie website tie-ins, will still be marginally funded by college
> students and the occasional eccentric wealthy individual, and Eliezer
> will still be hammering out his friendliness theory with absolutely no
> working code in the year 2025 - about the time a generic bayesian
> knowledge network bootstraps itself into superintelligence.
> 3. Awareness Efforts. Admittedly, I was a little excited about the
> flurry of activity that Tyler initiated, as I got the impression that
> SIAI was finally getting serious about promoting itself. This is what
> caused me to begin donating and helping in the first place. After a
> while, however, it became clear to me that it was more of the same.
> Undertaking herculean efforts in order to drive a few more hits to a
> website. Bringing on a 20-year-old with no significant network to go
> fundraising in silicon valley. Wow. What SIAI seems to not understand
> is that when one undertakes a serious round of fundraising, one must
> approach potential serious investors with either credibility or a
> product - and for reasons noted above, this is something SIAI does not
> have.
I don't claim to know what I'm doing on the fundraising side of things.
What I know is that restraining my annoyance was getting on my nerves
sufficiently to start seriously hurting, and there was no apparent marginal
benefit (it's not like SIAI was busily taking off on the wings of people
vastly impressed with my self-restraint), so I gave up and decided to
transform my public image into something I could sustain without going
nuts. I.e., a mad scientist. My prediction is that despite all the claims
of glooming and dooming, in two months everyone will forget that I was ever
anyone else. It's only the shock of the change that leads people to
believe anything significant has happened. I do not believe that SIAI is
marginally disadvantaged by the marginal change.
Call this my settling in for the long haul, choosing a mode of existence in
which I can be sustainably, if not happy, then at least not pointlessly
unhappy.
If you think the Earth is doomed, step up and save it. I think I can
figure out how to solve the technical question of FAI. That's all I ever
claimed to be good for. SIAI will continue to be composed of those people
who are willing to step up to the plate, not lecture us on what we're doing
wrong. If that means a 20-year-old has to grow into the challenge, so be
it. He's here, he's willing to try, it's one more person to stand on the
battlefield so humanity doesn't go down without a fight. This
distinguishes him from all the hypothetical better people who aren't here
and helping. It is incremental progress. You want it to go faster? Get
behind and push. You want to be in a position to tell me to shut up for PR
reasons? Get involved enough and it could happen. Heck, if you were a
current donor I'd be paying a lot more attention to you right now. If you
don't like the way Earth's destiny is developing, you are welcome to do
what you can to change it. If SIAI had enough funding to hire one of those
professional fundraisers, he'd be in a position to tell me to shut up.
You see the problem? You solve it. Yes, you, personally. Don't plan on
anyone doing it for you.
Meanwhile, I will try and remember that even if everyone calls me arrogant,
this is not who I actually am; I am a mad scientist. It will be harder for
me to remember the more time people spend calling me arrogant. If you all
told me I was sweetness and light, I'd probably get sweeter. Bear that in
mind.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence