From: Patrick McCuller (patrick@kia.net)
Date: Tue Feb 22 2000 - 16:46:00 MST
From: "Patrick McCuller" <patrick@kia.net>
I'm nitpicking today.
> From: Marc Forrester <SL4@mharr.force9.co.uk>
>
> Moore's law is the rate at which computers advance in the absence of major
> paradigm shifts, such as replacing valves with transistors, or integrating
> transistors into microcircuit chips. The next shift will be from single
> central processing units to massive parallel-processing on one chip,
> the question is whether this happens before the arrival of strong AI,
> or after it. I think the global entertainment industry has the
> resources to make it before. Call it a 50/50 chance.
In his book 'The Age of Spiritual Machines', Kurzweil makes a pretty good
case that Moore's law traces a curve describing the amount of computational
power available for fixed money at any given time. He fits that curve onto
a graph going back a century, to human computers and so on. That span
covers - I don't have the book here to reference - at least three paradigm
shifts. One heartening thing is that the actual curve runs slightly ahead
of Moore's law, though not by much.
We could call it Kurzweil's law.
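Just to put a shape on that claim, here is a toy sketch of a fixed-dollar
compute curve. The 18-month doubling time and the baseline are placeholder
assumptions of mine for illustration, not figures from Kurzweil's book:

    # Toy model: compute per constant dollar, doubling on a fixed schedule.
    # The doubling time and baseline are illustrative assumptions only.
    def compute_per_dollar(years_from_now, baseline=1.0, doubling_years=1.5):
        """Relative compute per dollar after the given number of years."""
        return baseline * 2 ** (years_from_now / doubling_years)

    for years in (0, 3, 6, 9):
        print(years, compute_per_dollar(years))  # 1.0, 4.0, 16.0, 64.0

A plain fixed doubling time like this is what Moore's law alone predicts;
Kurzweil's fitted curve runs slightly ahead of it, as noted above.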
> > Who's to say an AI won't have a hundred
> > million generations of descendants?
>
> Well, therein lies the singularity stuff. For that to happen, the
> generations need a world millions of times faster than our physical
> one to grow up in, and our technology is nowhere near being able to
> provide such things. Theirs could be, but if it is they hardly
> need worry about oppression from the likes of us. :)
They don't really need to be millions of times faster. There's plenty of
time left in the universe even at our rate. I'm sorry, this is a nitpick.
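(Rough arithmetic, assuming human-scale generation times purely for
illustration: a hundred million generations at twenty years each is
100,000,000 x 20 = 2,000,000,000 years - enormous, but still less than the
roughly five billion years our own Sun has left, never mind the rest of the
universe.)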
>
> Pain is any sensory qualia that induces a feeling of suffering.
>
> Suffering is something else entirely, and it is quite separate.
> Minds can suffer without pain, (Although human minds tend to
> experience hallucinatory pain as a result of such suffering,
> another result of our long and messy evolution) and they can
> experience pain without suffering.
This leaves open the possibility that suffering is itself qualia.
>
> Pain, as the mechanism through which a creature suffers as a
> direct result of injury, is not necessary, and may be discarded,
> if physical injury is nothing more than an inconvenience.
I do indeed concur, though I feel that the possibility of physical
destruction, no matter how remote, can never be eliminated. A supernova
could take out anything. Falling into a deep enough gravity well could take
out anything. (Actually, I'm trying hard to remember where I read a theory
of definite information loss at event horizons; you can pull mass out of a
black hole, but you can't get 'ordered' matter out.) I probably can't
accurately predict what would be lethal to a post-singularity being, but I'm
certain lethal events would exist. Therefore an awareness of injury and
physical danger will be part of any long-lived intelligent being, whatever
the qualia involved.
> It seems that this god-play may lead us to find answers to the
> ancient theological question of suffering. (Not that we claim to
> be omnibenevolent, of course.) It may well prove to be necessary
> for an intelligent mind to suffer, in order that it might have
> desires. Or it may prove possible to create minds that never
> suffer, but are simply more euphoric at some times than others.
I think what you're describing is called the soul-making theodicy, and
though I hate to keep quoting books at you, Martin's 'Atheism: A
Philosophical Justification' makes the case that suffering isn't necessary
to build character or to motivate desires. He postulates that even in a
world free from pain, people can still be generous with time or love. And
even if there is no suffering, there must always be a happiness gradient -
you could always be happier. The world could always be better.
In application to AI, conditions will never be perfect. An AI will take
actions and attempt to solve problems, and sometimes fail. Eliezer has an
elegant demonstration of goal formation in a state in which there are no
pressing problems to attend to:
http://singularity.posthuman.com/tmol-faq/logic.html
> > Philosophy aside, intentionally damaging an AI's
> > cognitive processes would constitute torture.
>
> That doesn't sound right.. If someone opened up my brain and
> intentionally damaged my cognitive processes I'd certainly call
> that grievous assault, but torture? That would apply more to
> the physical process of sawing my head open. Then again, if I
> was aware of the whole thing I doubt I'd be comfortable with
> the experience whatever the physical pain involved.
Messing with cognitive processes can be torture. Imagine total paranoia.
Imagine having panphobia (also called panophobia): a fear of everything.
Imagine having Capgras'
syndrome or even Cotard's syndrome. (People with Capgras' syndrome often
believe that their loved ones have been replaced by exact duplicates:
robots, aliens, clones, whatever. This is a result of brain damage that
affects the association of visual perception with emotional response. People
with Cotard's syndrome, which I gather is immensely rare, lose their ability
to associate or remember associations of emotional response with *anything*,
even themselves. They usually believe that they are dead. Apparently some
even hallucinate their own flesh rotting away.)
>
> Perhaps torture is the act of forcing an experience on a creature
> against its will, with the intent to cause suffering. That way it
> doesn't matter what pain and suffering actually mean from the
> victim's perspective, it is the intent to cause it that makes
> the act one of torture.
The victim's perspective has to be a part of it; otherwise torture could be
a victimless crime, a notion I reject for all sorts of moral reasons.
(The victim's perspective can't be the entire matter either.) We could
postulate that torture isn't necessarily a crime; but unless it is
self-inflicted, that doesn't make a lot of sense either.
Though even arguing this opens up huge avenues. For instance, it may be
possible to kill someone without their experiencing pain - or even anything
at all. Sudden enough death would preclude any kind of realization. This
wouldn't be torture, but it would still be wrong.
I have a feeling the whole thing is semantics.
Patrick McCuller