From: Sebastian Hagen (email@example.com)
Date: Wed Dec 08 2004 - 11:25:42 MST
Keith Henson wrote:
> Do you know what cryopreservation costs? Most people fund it with life
> insurance. If you are reasonably young, the cost is under a dollar a day.
I've read <http://alcor.org/BecomeMember/scheduleA.html> and
<http://alcor.org/SuspFunding/index.html>, though not anything on the
prices of other cryonics organizations, or for that matter any other
material on life insurance. I don't expect their prices or estimates to
be worse than the best available offer by more than one order of
magnitude, and they would have to be that far off for my current
conclusion to be incorrect on grounds of insufficient data about this
topic.
> If you *have* the money that's probably the case. I don't know how much
> typical insurance payments for cryonics patients would help.
Even if we assume that life insurance has a favorable cost/benefit ratio
for someone, there is still the question of what to do with the money in
the event of their death; i.e. they could also have it donated to an FAI
research organization.
If they survive until involuntary death is abolished or until the
destruction of society, whichever happens first, the insurance premiums
will (for our purposes) have been wasted in any event.
If they die earlier and use the money to pay for a suspension, they get
a chance of having their identity preserved and, assuming that society
sorts out the relevant challenges, of being woken up and living for
potentially a very long time.
If they die earlier and have the money donated to the FAI research
organization, they will probably stay dead; however, the money they give
to the organization may save a lot of other people. Or it may not; that
depends on whether the organization already has enough money to do what
it is planning, whether the other preconditions for success are met, etc.
There are lots of unknowns in both options.
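To make the comparison between the two uses of the insurance money
concrete, here is a toy expected-value sketch. Every probability and
payoff in it is an illustrative placeholder of my own, not an estimate
made anywhere in this thread:

```python
# Toy expected-value comparison of the two uses for the insurance
# proceeds discussed above. All numbers are made-up placeholders,
# chosen only to show the structure of the comparison.

def expected_lives_saved(p_die_early, p_payoff, lives_if_payoff):
    """Expected lives saved: chance of dying before the issues are
    resolved, times the chance the chosen option pays off, times the
    number of lives saved if it does."""
    return p_die_early * p_payoff * lives_if_payoff

# Option 1: fund one's own suspension.
# p_payoff: chance society survives AND revival eventually works.
cryonics = expected_lives_saved(p_die_early=0.5,
                                p_payoff=0.05,
                                lives_if_payoff=1)

# Option 2: donate the proceeds to FAI research.
# p_payoff: chance this particular donation is the margin that matters.
donation = expected_lives_saved(p_die_early=0.5,
                                p_payoff=0.001,
                                lives_if_payoff=1_000_000)

print(f"cryonics: {cryonics}, donation: {donation}")
```

With these particular placeholders the donation branch dominates, but
the point of the sketch is only that the conclusion hinges entirely on
the unknown probabilities, which is the problem discussed below.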
> True. You might feel the need to factor in the chances (despite their
> best efforts) of a Friendly Singularity organization accidently screwing
True. However, all of the alternatives known to me look even worse. The
best you can do is to choose the organization that is, according to the
data available, least likely to screw it up entirely while still having
a reasonable chance at succeeding at transhuman FAI.
Getting cryogenically suspended doesn't solve that problem for you; if
society doesn't sort out enough of the current issues to develop ways to
defeat death and retrieve the critical information from frozen brains
before it destroys itself (if ever), no one will wake you up.
>> And, obviously, if you really value the life/identity of certain
>> persons over that of a bunch of (probably) total strangers it makes
>> sense to try to save the former at the cost of the latter.
> Your genes have been selected for this bias: You should value [...]
The biases you describe are the ones that prevent me from simply
assigning different inherent undesirabilities to the deaths of certain
specific persons; since I'm not capable of completely cancelling them
with reasonable effort, I simply avoid these judgments altogether.
> Of course, if you have substantial income, you can afford to donate to
> Friendly AI work and get yourself frozen if you need it.
Then again, you could always donate more to FAI research. It would take
a lot of income to completely overfund all projects which you expect to
have positive outcomes (high P(Friendly_SI)/P(existential_disaster)
ratios). Up to that point, donating more instead of having yourself
suspended remains a viable alternative.
> If you think you might want cryonic suspension if you needed it, *at
> least* get enough inexpensive life insurance to fund it. You can always
> assign the insurance proceeds to SI if you are lost at sea.
That depends on how rational I expect my future self to be about these
matters. Maybe I'll turn into a complete egoist, and make the (from my
current perspective) foolish decision to opt for the suspension even
though the donation would probably save more lives.
Another relevant factor is whether the insurance proceeds would be paid
out before the relevant issues are resolved (see above), and how much
good the whole sum would do at that point compared to smaller amounts at
earlier points in time.
The only factor I'm aware of that is reasonably likely to make all of
these estimates absolutely, horribly wrong is the
P(Friendly_SI)/P(existential_disaster) ratios. Incorrect estimates of
those could, as you mentioned, lead to a donor actively increasing the
chance of a catastrophe through their resource diversion.
As long as one can exclude that possibility with sufficiently high
probability, donating to the FAI organization looks (given the prior
assumptions) like the safe course of action to me: the most you have to
lose is your own identity, as opposed to those of a rather large number
of other people. That doesn't mean it is better on average, of course,
but from the data currently available to me, that also appears to be
the case.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:50 MDT