From: Lee Corbin (lcorbin@rawbw.com)
Date: Sun Apr 20 2008 - 08:11:03 MDT
Stuart writes
[Unknown attributions]
>> > > I guess my question would be why you value your life more than your
>> > > copy's, if you are both identical aside from a few neurons here and
>> > > there? They both should have roughly the same value to you.
>
> To answer your question from my personal perspective: My personal code
> of ethics is what I believe would be better for the world. My actual
> behaviour is more complicated, parasitised by actions that are better
> for me personally.
I totally understand. I tried for probably two decades to combine into
a single system what would be called "what I really want". But the parts
"what seems to be best for the world" and "what is best for Lee"
stubbornly refused to be assimilated into a single consistent belief system.
There were just too many thought experiments that could powerfully
drive a wedge between my idealism and my selfish concerns.
For example, two choices: "A: save all your family and friends
from instant death, or else B: save all the people living in Burma
from instant death." Clearly choice B is preferable on any idealistic
basis, and indeed only *ignorance* keeps me from choosing B.
If I had a decent (shall we say "familiar") knowledge of all the
lives, all the joys and sorrows, and the vital experiences of the Burmese,
then I could not stand to see those millions perish for the sake of
a few dozen people I happened to know about. But indeed,
I gave up on B, and submitted to what I truly wanted. I wanted
to make choice A, and I would make choice A.
Finally, as I say, I just gave up and admitted that there were two
spheres of ethics that I embraced. I did try to get them to overlap
wherever possible, but still, I had to recognize their separate
existence. My seeking of consistency throughout the entire space
of possible choices---possible thought experiments---could only
be successfully carried out within each sphere.
> Therefore, after all these arguments, I believe that there would
> be no ethical problem with destructive teleportation with instant
> destruction. I'm also pretty convinced that teleportation with a
> ten second delay before destruction is fine, maybe even if the
> "original" is conscious and aware during that time.
The emphasis of the problem of personal identity has ALWAYS
been on what an MSI (Most Selfish Individual) would do. Conflating
the issues of ethical behavior and personal survival does not help
in understanding either one.
Were I an MSI, I would still unhesitatingly teleport, lose some
weeks' memories, be replaced by a frozen copy made weeks
ago, choose my own instance's death over my copy's death---
any or all of these if the price were right, e.g., $10M. And I
think it would be foolish for anyone to give up the $10M
on account of unmeasurable "philosophic" or "soul-like"
differences.
> So, if I was an MP [Member of Parliament], I would probably
> vote to allow most forms of destructive teleportation if the issue
> came up - but would be terrified of trying them out myself.
"Vote to allow"? How jolly condescending of you! I hope that
you would also "vote to allow" people to take their own lives
(with the proper filling out of forms, at least, so that murder or
temporary insanity could be ruled out). I also would hope that
you would vote to allow people to ingest whatever substances
they chose, provided it resulted in harm to no one else. Dear
Stuart, sorry for my abrasive and aggressive phrasing and tone,
but this is my morning personality, which likes to cut straight
through to the issues :-), give no quarter, and speak very frankly.
Best personal regards as always,
Lee