From: Marc Geddes (firstname.lastname@example.org)
Date: Sun May 23 2004 - 23:48:58 MDT
--- Eliezer Yudkowsky <email@example.com> wrote:
> Eliezer Yudkowsky wrote:
> > I am not saying that you will end up being stuck at your current level
> > forever. I am saying that if you tried self-improvement without having
> > an FAI around to veto your eager plans, you'd go splat. You shall write
> > down your wishlist and lo the FAI shall say: "No, no, no, no, no, no,
> > yes, no, no, no, no, no, no, no, no, no, yes, no, no, no, no, no." And
> > yea you shall say: "Why?" And the FAI shall say:
> > Someday you will be grown enough to take direct control of your own
> > source code, when you are ready to dance with Nature pressing her knife
> > directly against your throat. Today I don't think that most
> > transhumanists even realize the knife is there. "Of course there'll be
> > dangers," they say, "but no one will actually get hurt or anything; I
> > wanna be a catgirl."
>
> Just in case it is not clear, I do not think I am grown enough to mess with
> my own source code. At best I am grown enough to be scared away in my own
> right, not just because an FAI tells me it would be a bad idea.
>
> Eliezer S. Yudkowsky
> Research Fellow, Singularity Institute for Artificial Intelligence
I'm not sure why you have suddenly started claiming that human
self-modification is very problematic. I do agree that it would be
exceedingly complex and dangerous to try to actually eliminate the
*capacity* for experiencing specific qualia. But as far as I can make
out, all of the problems people have with specific qualia come about not
because of the *qualia* per se, but because of the *reaction* to them.
Addiction, distraction, obsession, frustration and so on are all
consequences of the *desire* for qualia. For instance, a person may be
addicted to chocolate. But the problem is not the ability to taste
chocolate per se; it is the *desire* to taste it. So we don't need to
eliminate the *capacity* for qualia to fix the problem. We just need to
adjust the *desire* factor.
And there are reasons for believing that adjusting *desire* is actually
quite simple. There is a known medical condition (pain asymbolia, I
believe it's called) in which a person has full sensory experience but
doesn't care about it in the slightest. When a person with this
condition is subjected to pain, they are *aware* of the pain, but it
doesn't perturb them. So it seems that attraction and repulsion to
specific qualia are totally independent of the *capacity* for the
qualia. Other evidence: drugs that eliminate addiction, hormones that
easily turn a person's sex drive up and down, etc.
I see no reason why we need to try to eliminate the *capacity* for
specific qualia. If we don't like certain qualia, all we have to do is
turn down our *desire* for them, and they should no longer bother us.
Sure, that will lead to some inefficiency if the qualia are no longer
useful to us when we become post-human, but it shouldn't be a problem.
It just means that the software encompassing our minds will have to be
treated as a legacy system.
And although there is reason for thinking that trying to eliminate the
capacity for specific qualia would be exceedingly complex and dangerous,
the converse is not true. There is no reason for thinking that ADDING
the capacity for new kinds of qualia would be problematic. Since the
mind has the capacity for learning, it should be able to adjust to any
new capacities quite easily.
"Live Free or Die, Death is not the Worst of Evils."
- Gen. John Stark
"The Universe...or nothing!"
Please visit my web-sites.
Science-Fiction and Fantasy: http://www.prometheuscrack.com
Science, A.I, Maths : http://www.riemannai.org
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT