From: maru (marudubshinki@gmail.com)
Date: Sat Jan 22 2005 - 18:28:46 MST
This is an issue of expected utility, is it not? The payoff for a
friendly Seed AI is orders of magnitude higher than that for a
multi-million-dollar jackpot (after taxes). I consider the chance of
Yudkowsky et al. succeeding to be considerably better (though still not
acceptable) than the billions-to-one-against odds of the lottery. Add
in the higher return, and the expected utility of donating to the
Institute is considerably higher than that of a lottery ticket.
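
To put rough numbers on that, here is a minimal sketch of the
expected-utility arithmetic. Every probability and payoff below is an
illustrative assumption of mine, not a figure from SIAI or anyone else:

# Back-of-the-envelope expected-utility comparison (Python).
# Every number here is an illustrative assumption, not anyone's claim.

def expected_utility(p_success, payoff, cost):
    # Expected value of spending `cost` on a gamble with the given odds.
    return p_success * payoff - cost

# Lottery: ~1-in-100-million odds, ~$10M jackpot after taxes, $1 ticket.
lottery = expected_utility(p_success=1e-8, payoff=1e7, cost=1.0)

# Friendly Seed AI: even at a pessimistic 1-in-10,000, the payoff dwarfs
# any jackpot; 1e15 is a stand-in for "orders of magnitude more".
donation = expected_utility(p_success=1e-4, payoff=1e15, cost=1.0)

print("lottery ticket EU: %+.2f" % lottery)    # about -0.90 per dollar
print("donation EU:       %+.2e" % donation)   # about +1e11 per dollar

# Break-even threshold: a $1 "ticket" is rational (in pure EV terms)
# only when p_success > cost / payoff.

The point is that the threshold Slawomir asks about below falls straight
out of cost / payoff: the larger the payoff, the worse the odds you can
rationally accept.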
~Maru
Slawomir Paliwoda wrote:
> Eliezer's argument seems self-defeating unless he thinks the odds of
> creating AI, let alone FAI, by SIAI's team are better than the odds of
> winning the lottery. Otherwise, it would be irrational to tell people
> not to buy lottery tickets while requesting support for SIAI's project
> to build AI that has similarly poor odds of success. If we assume the
> argument is consistent, the difference between a plain lottery ticket
> and SIAI's version of a lottery ticket must lie in the perception of
> their respective odds, with the latter "ticket" having a much better
> chance of winning than the former. So the question is very clear: what
> odds does SIAI think it has of succeeding, and beyond that, what, in
> general, is the odds or probability threshold above which it would be
> rational to buy such a lottery ticket? 1/10000, 1/1000, 1/100, 1/10?
>
> Slawomir
>