From: Nick Tarleton (firstname.lastname@example.org)
Date: Fri Sep 26 2008 - 09:19:50 MDT
On Thu, Sep 25, 2008 at 4:27 PM, Matt Mahoney <email@example.com> wrote:
> Belief in consciousness is universal, as is the desire to preserve it.
> Therefore you will make a copy of your mind, technology permitting. Whether
> that copy actually contains your consciousness or just makes that claim is
> irrelevant to any future observable events.
> (Also, how do the above articles relate to this position?)
"Relevant" or not, I prefer that my consciousness persist. (The articles
make the point that my preferences may involve non-ontologically
primitive or non-natural categories, including ones I don't yet fully know
how to define, like "contains my consciousness".)
> Bostrom does not seem to offer any good alternatives.
> In any case, he implicitly assumes that certain forms of intelligence, what
> he calls eudaemonic (with human-like motivations and "conscious"), are
> preferable to other types.
Bostrom prefers eudaemonic agents, as do I, whether or not they're
preferable in some universal sense.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT