From: Eliezer Yudkowsky (firstname.lastname@example.org)
Date: Sun Jan 23 2005 - 04:36:36 MST
Slawomir Paliwoda wrote:
>> Most lottery players know the expected value of a lottery ticket is less
>> than the price they pay for the ticket, but they play anyway because
>> they like the psychological drama of it.
> They play because they are not strong enough in the Way to feel emotionally
> the true meaning of the words, "Odds of a hundred million to one". They
> have not learned to translate mere dry statistics into a feeling of
> absolute and utter certainty, far exceeding any proposition of science or
> everyday life, that they SHALL NOT win the lottery. If you understand
> that, there is no psychological drama.
> Eliezer's argument seems self-defeating unless he thinks the odds of
> creating AI, let alone FAI, by SIAI's team are better than the odds of
> winning the lottery. Otherwise, it would be irrational to tell people not
> to buy lottery tickets while requesting support for SIAI's project to
> build AI that has similarly poor odds of success.
Undoubtedly some will reply, "Given the potential payoff, you should buy
that lottery ticket even if the odds were a billion to one." (Since what's
wrong with the lottery ticket isn't just the tiny odds, but that the payoff
doesn't match the tiny odds, especially when the diminishing marginal
utility of money is taken into account. Otherwise, there'd be an industry
in millions of people clubbing up to buy a sufficient number of lottery
tickets, then dividing the proceeds.)
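The arithmetic behind that parenthetical can be sketched with invented numbers (not any actual lottery's odds or prizes): even before diminishing marginal utility, a ticket's expected value falls short of its price, and a concave utility function over wealth makes it worse still.

```python
# Sketch of the expected-value argument, with illustrative numbers
# (an assumed $1 ticket, $50M jackpot, 1-in-100M odds).
import math

ticket_price = 1.0
jackpot = 50_000_000.0
p_win = 1.0 / 100_000_000  # "odds of a hundred million to one"

# Raw expected value: already less than the ticket price.
ev = p_win * jackpot
print(ev)  # 0.5 -- fifty cents of expected payout per dollar spent

# Diminishing marginal utility (log utility over total wealth) makes
# the ticket look even worse: the jackpot's utility does not scale
# linearly with its dollar amount.
wealth = 50_000.0
u_no_ticket = math.log(wealth)
u_ticket = (p_win * math.log(wealth - ticket_price + jackpot)
            + (1 - p_win) * math.log(wealth - ticket_price))
print(u_ticket < u_no_ticket)  # True: buying lowers expected utility
```

The same concavity is why the ticket-pooling club in the parenthetical doesn't exist: splitting a jackpot many ways buys dollars whose marginal utility is lower still.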
I do not answer in this way, because I am on the inside, not the outside,
of SIAI. I do not think that a billion-to-one chance of humanity surviving
is acceptable. I do not think that a hundred-to-one chance is acceptable.
I would not be investing my effort in SIAI unless I thought that, given
some set of actions cognitively reachable to me, the odds could reach at
least one-in-ten and preferably five-in-ten. I may not regard these odds
as fixed because they depend on my actions.
> If we assume this is
> a consistent argument, the difference between a plain lottery ticket and
> SIAI's version of a lottery ticket must lie in the perception of their
> respective odds, with the latter "ticket" having much better chance of
> winning than the former. So the question is very clear. What kind of
> odds does SIAI think it has of succeeding,
I think there's a way for me to get the odds to at least one-in-ten, if I
make the right choices; nor are your own choices irrelevant. If a billion
to one isn't good enough for you, get out and push, or die; those are your
choices.
> and beyond that, what is, in
> general, the odds or probability threshold above which it would be
> rational to buy a type of lottery ticket? 1/10000, 1/1000, 1/100, 1/10?
In general, the answer from expected utility is that it depends simply on
the lottery payoff and on the payoff of other available investments. The
only reason I complicate the answer beyond that is because I'm a human,
with aspirations and emotions and so on, and to me there is a qualitative
difference between being satisfied with a choice and needing to try harder.
(Under expected utility maximization, you always try harder.)
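The expected-utility answer above can be put as a one-line decision rule, again with made-up numbers (the function name and figures are illustrative, not SIAI's actual estimates): there is no fixed probability threshold; the bar falls out of the payoff and of what else the same money could buy.

```python
# Minimal sketch of the expected-utility answer: no fixed threshold,
# just a comparison against the best alternative use of the money.

def worth_taking(p, payoff, cost, best_alternative=0.0):
    """Risk-neutral rule: take the gamble iff its net expected value
    beats the best alternative investment of the same cost."""
    return p * payoff - cost > best_alternative

# With a large enough payoff, even billion-to-one odds clear the bar...
print(worth_taking(p=1e-9, payoff=1e12, cost=1.0))            # True
# ...unless a competing investment does better with the same dollar.
print(worth_taking(p=1e-9, payoff=1e12, cost=1.0,
                   best_alternative=2000.0))                   # False
```

This is the risk-neutral version; folding in diminishing marginal utility, as the parenthetical earlier in the post notes, only raises the bar further.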
--
Eliezer S. Yudkowsky                http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:50 MDT