From: Marc Forrester (SL4@mharr.force9.co.uk)
Date: Fri Feb 25 2000 - 06:39:50 MST
Patrick McCuller: Tuesday 22-Feb-00
> I'm nitpicking today.
So long as you don't eat any.
> In his book 'The Age of Spiritual Machines', Kurzweil makes a pretty
> good case that Moore's law describes a curve of the amount of
> computational power available for fixed money at any given time.
> He fits Moore's law onto a graph going back a century, to human
> computers and so on. There are - I don't have the book to reference -
> at least three paradigm shifts.
Okay, but that's taking a long-term view over the shallow end of the curve;
it's not too shocking that a chaotic factor might fit in with the underlying
trend there. But as the technology accelerates, I suspect the random/chaotic
nature of the shifts will start to cause more deviation. Does the arrival
of the Internet fit into the curve? What about SETI@home?
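Just to make the shape of the claim concrete, here is a minimal sketch (mine,
not from the book) of the kind of doubling law being fitted; the starting
value, base year and doubling period are made-up placeholders, not Kurzweil's
figures:

    # Compute per constant dollar under a simple doubling law:
    #   C(t) = C0 * 2 ** ((t - t0) / T)
    # c0, t0 and doubling_years are illustrative assumptions only.
    def compute_per_dollar(year, c0=1.0, t0=1900.0, doubling_years=2.0):
        """Operations per second per constant dollar at a given year."""
        return c0 * 2 ** ((year - t0) / doubling_years)

    # e.g. the ratio between two points fifty years apart on the curve:
    print(compute_per_dollar(1950) / compute_per_dollar(1900))  # 2**25

The question above is whether events like the Internet sit on that line
or wander away from it.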
>> Pain is any sensory qualia that induces a feeling of suffering.
>>
>> Suffering is something else entirely, and it is quite separate.
>> Minds can suffer without pain, (Although human minds tend to
>> experience hallucinatory pain as a result of such suffering,
>> another result of our long and messy evolution) and they can
>> experience pain without suffering.
>
> This leaves open the possibility that suffering is itself qualia.
This is real nature-of-existence stuff, isn't it?
Hmm. The difference between pain and suffering is that pain is a
sense between body and mind, whereas suffering is something that can
happen internally. Indeed, deprive a mind entirely of sensory qualia
and it will suffer as a direct result, and probably go quite mad,
even cease to be a mind at all.
I suppose it is a qualia in itself, but not of quite the same form
as 'green', for example. 'Green' is one qualia through which the
sense of sight is experienced directly, while 'suffering' is one
through which the mind directly experiences its own state.
Does this suggest that only a self-aware mind can suffer?
(Accepting that 'self-aware' is a property with fuzzy edges)
> I probably can't accurately predict what would be lethal to a post-
> singularity being, but I'm certain lethal events would exist.
> Therefore an awareness of injury and physical danger will be a part
> of any long-lived intelligent being, whatever the qualia involved.
True enough. Minds must be able to learn, therefore able to change,
and so they will always be ultimately mortal. Does awareness of danger
have to come as a direct sense, though, or is it enough to be able to
look at something through pure information senses and think 'I don't
intend to do that, because I would die'? It should be easy enough to
protect children and animals from accidentally falling into black holes.
> I think what you're describing is called the soul making theodicy,
Sounds like a high school rock band. :)
> Martin's 'Atheism: A Philosophical Justification' makes the case that
> suffering isn't necessary to build character or to motivate desires.
> He postulates that even in a world free from pain, people can still be
> generous with time or love. And even if there is no suffering, there
> must always be a happiness gradient - you could always be happier.
> The world could always be better.
But just to play god's advocate, is the zero point on the happiness
gradient absolute, or will an individual always acclimatise to their
life so that one end of their range of feeling is always experienced
as personal grief, and the other as joy? Is suffering really nothing
more than the gut knowledge that you could be happier than you are?
>> That doesn't sound right.. If someone opened up my brain and
>> intentionally damaged my cognitive processes I'd certainly call
>> that grievous assault, but torture?
>
> Messing with cognitive processes can be torture. Imagine total paranoia.
> Imagine having panphobia or even panophobia! Imagine having Capgras'
> syndrome or even Cotard's syndrome.
Happily beyond my powers of visualisation.. I take the point.
Randomly crosswiring any mind is likely to hurt it horrifically.
On the question of causing accidental suffering, the waters get
very murky indeed. I'd say that the risks you take with another's
well-being are at the core of the matter, not the outcome.
>> Perhaps torture is the act of forcing an experience on a creature
>> against its will, with the intent to cause suffering. That way it
>> doesn't matter what pain and suffering actually mean from the
>> victim's perspective, it is the intent to cause it that makes
>> the act one of torture.
>
> The victim's perspective has to be a part of it, otherwise torture can
> be a victimless crime, which I don't believe in for all sorts of moral
> reasons. (The victim's perspective can't be the entire matter either.)
A victimless crime, just because an attempt to torture happens to fail?
I don't know about that.. If someone tries to punch me in the face,
I consider that to be an assault irrespective of how well they actually
manage to aim.
> Though even arguing this opens up huge avenues. For instance, it may
> be possible to kill someone without their experiencing pain - or even
> anything at all. Sudden enough death would preclude any kind of
> realization. This wouldn't be torture, but it would still be wrong.
Separate issue, I think. If you don't own your own existence,
to continue or terminate it as you will, then what can you own?
That said, I have killed animals to end pointless suffering.
Sometimes you can't know the right thing to do, and you just
have to guess. I hope I never have to choose for a friend,
but I'd make a judgement and live with it. I'd rather get
it wrong than refuse to think.
> I have a feeling the whole thing is semantics.
Possibly. I think we both know torture when we see it,
but we are set up for empathy with human beings and cute
furry animals, not machines. We may need a system of
ethics built from cold hard logic before long.