From: Christopher Healey (CHealey@unicom-inc.com)
Date: Sat Feb 18 2006 - 14:33:05 MST
I was one of the potential donors who had contacted Tyler about making up any shortfall.
After sending my inquiry to him, I realized that taking this position had more to do with my perception (read: emotional reward) of doing the most good, rather than doing the most good *in reality*. The latter is my ultimate goal.
I have just placed my donation, in the maximum amount I would have been able to match if a shortfall were in the cards. In other words, if I can donate it under a Challenge shortfall, I can donate it under any condition, and so I have.
If you're one of the others who contacted Tyler, or are somebody who's been thinking of donating (but *really* want to feel you're achieving maximum impact), I'd ask you each to reassess your position in this light.
From: firstname.lastname@example.org on behalf of Tyler Emerson
Sent: Sat 2/18/2006 3:35 PM
To: email@example.com; firstname.lastname@example.org
Subject: RE: Reminder: SIAI Challenge expires this Sunday | SIAI February news
I've had some potential donors say they're waiting to donate at the end to
help cover any remaining amount. If you *can* contribute, please don't wait.
Please recognize the Prisoner's Dilemma effect here: if everyone waits until
Sunday to cover what each expects will be a small remaining amount, a large
amount is left at the end, which discourages the waiters from giving at all
and leaves the Institute with a large sum of unmatched funds.
Tyler Emerson | Executive Director
Singularity Institute for Artificial Intelligence
P.O. Box 50182 | Palo Alto, CA 94303 U.S.
T-F: 866-667-2524 | email@example.com
www.intelligence.org | www.singularitychallenge.com
> -----Original Message-----
> From: Tyler Emerson [mailto:firstname.lastname@example.org]
> Sent: Friday, February 17, 2006 4:33 PM
> To: 'email@example.com'
> Subject: Reminder: SIAI Challenge expires this Sunday | SIAI February news
> The Singularity Institute's 2006 $100,000 Challenge with Peter Thiel, the
> former CEO of PayPal, will expire this Sunday, February 19. Any donation you
> make will be matched dollar-for-dollar. So far, SIAI has matched $93,486.
> You can donate and track our progress here:
> Personal checks postmarked by Sunday will be matched. If you send a donation
> by check, please let us know so we can keep the donation total accurate.
> Matching the Challenge is crucial for our growth. Below is a summary
> of our present projects, which I hope gives you a sense of our dedication.
> Your tax-deductible Challenge gift will support:
> * The production and promotion of the Stanford Singularity Summit, a
> conference to educate up to 1700 people in Silicon Valley about the
> singularity hypothesis - representing an unprecedented chance to promote
> further the nascent fields of singularity and global risk studies. A
> well-executed conference will increase the interest of gifted students,
> raise the legitimacy of the research, and expand the range of investors.
> Summit homepage teaser:
> * Our second full-time Research Fellow. We are looking for someone
> exceptional to collaborate with Yudkowsky on the challenge of a workable
> theory for self-improving, motivationally stable Artificial Intelligence.
> * A remarkable Development Director and Communications Director. I'm now
> looking for skilled and dedicated collaborators to scale the Institute.
> * Our forthcoming monthly speaker series, the Future of Humanity Forum, at
> Stanford's Hewlett Teaching Center (its main lecture hall seats 500). The
> Forum will be an ongoing series to complement and expand on the
> Summit. Date and time for the inaugural event will be announced shortly.
> * Yudkowsky's Friendly AI theory and design work, conference presentations,
> and published writing. He recently completed his two chapter drafts for Nick
> Bostrom and Milan Cirkovic's Global Catastrophic Risks (forthcoming): the
> first on cognitive biases potentially affecting judgment of global risks,
> the second on the unique global risks of Artificial Intelligence.
> * The Singularity Institute Partner Network. Later this year, we'll begin
> approaching potential inaugural partners to be our cornerstone for
> a network of companies, foundations, individuals, and organizations
> committed to advancing beneficial AI, singularity, and global risk research.
> * Medina's academic presentations. In January, Medina was awarded full
> financial support to attend a workshop on Bayesian inference,
> statistics, and machine learning at the Statistical and Applied Mathematical
> Sciences Institute in North Carolina. One of the most popular academic
> conferences on the interdisciplinary study of the mind, Tucson VII - Toward
> a Science of Consciousness, has accepted his proposal for a talk on the
> ethics of recursive self-improvement in April. He will also speak on
> Artificial General Intelligence ethics at AGIRI's first workshop on moving
> from narrow AI to AGI, and at Stanford Law School on a new problem for
> personhood ethics in light of human enhancement technologies, both in May.
> * Our organizational identity and website overhaul, after the Summit.
> Further details:
> If you aren't familiar with our work, please see:
> What Is the Singularity?
> Why Work Toward the Singularity?
> Additional news:
> * Yudkowsky will give a talk on February 24 at the Bay Area Future Salon at
> SAP Labs (attendance ranges from 70-100), to discuss the implications of
> recursive self-improvement for Friendly AI implementation, and the unique
> theoretical challenge that recursive self-improvement poses.
> 6:00-7:00PM (networking, refreshments), 7:00-9:00PM (talk, discussion)
> SAP Labs, Building D
> 3410 Hillview Avenue
> Palo Alto, CA 94304
> * Peter Thiel has joined The Singularity Institute's Board of Advisors:
> Comments are welcomed on the Summit teaser. Note that "What Others Have
> Said" has some known headshot-display and text issues that will be fixed.
> Thank you to everyone helping with the Challenge!
> Tyler Emerson | Executive Director
> Singularity Institute for Artificial Intelligence
> P.O. Box 50182 | Palo Alto, CA 94303 U.S.
> T-F: 866-667-2524 | firstname.lastname@example.org
> www.intelligence.org | www.singularitychallenge.com
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT