Re: Economics of cryonic suspension (was: Re: Acceptance of death)

From: Keith Henson (hkhenson@rogers.com)
Date: Tue Dec 07 2004 - 18:54:05 MST


At 06:46 PM 07/12/04 +0100, Sebastian Hagen wrote:
>Keith Henson wrote:
>>It is still strange and uncomfortable enough that only a minority of
>>those who consider themselves "transhumans" have taken steps to have it
>>apply to themselves or their family.
>I'm a transhumanist (calling myself a transhuman would simply be
>unrealistic) who hasn't signed up for cryopreservation because of the
>financial issues.

Do you know what cryopreservation costs? Most people fund it with life
insurance. If you are reasonably young, the cost is under a dollar a day.
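Put differently, "under a dollar a day" means a premium under roughly $30 a
month, which is the sort of rate a healthy young adult can get on a policy
large enough to cover a suspension minimum.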

>Unless you assign different (intrinsic) values to different human lives (a
>distinction that may be justifiable, but which I'm definitely not willing
>to make with my current biases), saving one human life is equivalent to
>stopping dying in general (e.g. with a Friendly Singularity) less than
>one second earlier than it would happen otherwise.

This sort of makes sense from an inclusive fitness perspective.
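The arithmetic behind that one second: roughly 55 to 60 million people die
each year, and a year is about 31.5 million seconds, so the world loses a
bit under two people per second. On that accounting, one saved life is worth
a little over half a second of ending death sooner.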

>Considering the current prices for cryopreservation of a single person, a
>donation of the same amount of money to a promising (and not already
>overfunded) Friendly Singularity researching organization is in my opinion
>likely to cut their research time by at least one second without
>compromising Friendliness.

If you *have* the money, that's probably the case. But most people fund
cryonics with insurance premiums rather than a lump sum, and I don't know how
much payments on that scale would help a research organization.
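To put rough, purely illustrative numbers on Sebastian's side of it: a
suspension runs in the tens of thousands of dollars, and if a research group
were burning, say, $500,000 a year, a $30,000 donation would cover about
three weeks of that budget, on the order of two million seconds. "At least
one second" is a very conservative bar.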

>Considering that reaching a Friendly Singularity earlier also (everything
>else being equal) decreases the chance of an existential disaster
>happening in the meantime makes the second alternative look like an even
>better option.
>
>Which of the actions you assign a higher expected utility to depends among
>other things on your estimation of the involved probabilities (e.g. that
>cryopreservation saves sufficient information, that the cryopreservation
>team will get to the body in time, that any of the (not overfunded)
>organizations you are aware of is likely to be able to trigger a Friendly
>Singularity, etc.).

True. You might feel the need to factor in the chances (despite their best
efforts) of a Friendly Singularity organization accidentally screwing up.
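
If anyone wants to play with the comparison, here is a toy version in Python.
Every number in it is a placeholder, not an estimate I would defend; the point
is just to see which probabilities the answer is sensitive to.

    # Toy expected-value comparison: cryonics for yourself vs. donating the
    # same money toward a Friendly Singularity. All numbers are placeholders.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    deaths_per_second = 57e6 / SECONDS_PER_YEAR   # ~1.8, from ~57M deaths/year

    # Option A: sign up for suspension.
    p_good_preservation = 0.5   # team reaches you in time, enough information saved
    p_eventual_revival = 0.1    # revival technology arrives and is applied to you
    ev_cryonics = p_good_preservation * p_eventual_revival   # expected lives saved (yours)

    # Option B: donate the money to a (not overfunded) research organization.
    p_org_succeeds = 0.01       # the org you fund actually gets us there, Friendliness intact
    seconds_advanced = 1.0      # how much sooner your donation makes it happen
    ev_donation = p_org_succeeds * seconds_advanced * deaths_per_second

    print("expected lives saved, cryonics:", ev_cryonics)
    print("expected lives saved, donation:", ev_donation)

With those made-up numbers the cryonics option comes out ahead, but flip a
couple of the placeholders and the donation wins; the exercise mostly shows
how much hangs on the probabilities you plug in.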

>And, obviously, if you really value the life/identity of certain persons
>over that of a bunch of (probably) total strangers it makes sense to try
>to save the former at the cost of the latter.

Your genes have been selected for this bias: you should value yourself as
much as two children, two brothers, eight first cousins, or 32 second
cousins (see William Hamilton's work on inclusive fitness). Since we evolved
in tribes where the average relationship was roughly that of second cousins,
our genes probably overvalue people we know, treating them as if they were
tribe members. And while we share very little relatedness with total
strangers, there are an awful lot of them to weigh in the balance.
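
For reference, the relatedness coefficients behind those numbers (a quick
sketch, nothing controversial in it):

    # Hamilton's coefficients of relatedness: you share a fraction r of your
    # genes by common descent with each relative, so it takes 1/r of them to
    # carry as many copies of "your" genes as you do.
    relatedness = {
        "child": 1 / 2,
        "full sibling": 1 / 2,
        "first cousin": 1 / 8,
        "second cousin": 1 / 32,
    }
    for kin, r in relatedness.items():
        print(f"{kin}: r = {r}, you = {int(1 / r)} of them")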

Of course, if you have substantial income, you can afford to donate to
Friendly AI work and get yourself frozen if you need it.

If you think you might want cryonic suspension when the time comes, *at
least* get enough inexpensive life insurance to fund it. You can always
assign the insurance proceeds to SI if you are lost at sea.

Keith Henson


