From: Slawomir Paliwoda (email@example.com)
Date: Sat Jan 22 2005 - 21:44:48 MST
> This is an issue of expected utility, is it not? The payoff for a
> friendly Seed AI is orders of magnitude higher than that for a
> multi-million dollar jackpot (after taxes).
Taking the expected utility of FAI into account seems like a good way of
justifying the claim that it is rational to support SIAI while it is
irrational to buy lottery tickets, until one realizes that all potential
benefits to humanity from having FAI around depend entirely on the
successful event actually happening. In other words, a person won't benefit
from FAI any more than from a winning lottery ticket if FAI doesn't get made
and he doesn't win the lottery. In both cases the utility will equal zero.
This makes the expected utility of FAI completely irrelevant to the
question of whether one should support SIAI rather than buy lottery tickets,
if we treat both "winning" events as having virtually no chance of
happening.
So Eliezer's argument implicitly tries to convince people not to bother
investing in anything with a low probability of success, regardless of the
return on the investment (otherwise we should be encouraged to buy
lottery tickets), inadvertently also making a good case for not supporting
SIAI, since the FAI project likewise has a low probability of success in
the eyes of many people.
> I consider the chance of Yudkowsky et al. succeeding to be considerably
> better (but still not acceptable) than the billions-against lottery odds.
Can I ask what you are basing this statement on?
> Add in the higher return, and the expected utility is considerably higher
> for donating to the Institute than for a lottery ticket.
True, but, like I said, the amount of expected utility is orthogonal to the
question of whether it is rational to invest at all, once both "winning"
events are treated as having virtually no chance of happening.
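The expected-utility bookkeeping behind this exchange can be sketched in a few
lines. The probabilities and payoffs below are purely illustrative assumptions
of mine, not figures anyone in the thread proposed:

```python
def expected_utility(p_success, payoff):
    """Expected utility of a gamble paying `payoff` with probability `p_success`."""
    return p_success * payoff

# Illustrative, assumed numbers only:
lottery_eu = expected_utility(1e-9, 1e8)   # billions-against odds, ~$100M jackpot
fai_eu = expected_utility(1e-6, 1e15)      # tiny assumed odds, astronomically large payoff

# The quoted argument: even with tiny odds, a vastly larger payoff
# gives the FAI donation the higher expected utility.
assert fai_eu > lottery_eu

# The counterpoint above: if both probabilities are treated as
# effectively zero, both expected utilities collapse to zero,
# no matter how large the payoffs are.
assert expected_utility(0.0, 1e8) == expected_utility(0.0, 1e15) == 0.0
```

The disagreement, on this sketch, is over whether the probability factor may be
rounded to zero before or after the multiplication is done.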
This archive was generated by hypermail 2.1.5 : Tue Jun 18 2013 - 04:00:45 MDT