From: David Picon Alvarez (firstname.lastname@example.org)
Date: Tue Dec 27 2005 - 04:13:28 MST
From: "Phillip Huggan" <email@example.com>
> What if destroying us is not similar to database deletion? What if the
> crappy tools we use to resolve our "consciousness data" (whatever that
> means) aren't good enough to capture the physics of what makes us sentient?
> What if the process of "transferring" (how exactly?) our personal identities
> to backup hard-drives and then back results in killing off our personal
> identities? I'm pretty sure our consciousness is an emergent property of the
> physics of our central nervous systems; probably EM interactions of some
> sort. We are smart enough to, and should, override the naturally occurring
> mistake of assuming Strong-AI is responsible for our sentience (it isn't.
> If it were, how come everyone isn't psychic?). Of course, if the hard-drive
> backups utilized a Type VII civilization multiverse generator
> thing-a-ma-jig of the kind described in another sl4 thread a few days ago,
> it's a whole new ball of wax.

Please explain that. Why would people be psychic if the strong AI hypothesis
were correct? I'm assuming you use the word psychic here in its everyday
meaning, of having extraordinary powers. To my mind, asking why people
aren't psychic if strong AI is correct is the same as asking why computers
cannot read each other's states; id est, I don't see what link there is, if
any, between the one thing and the other.
A strong AI supporter,
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT