From: Sebastian Hagen (sebastian_hagen@gmx.de)
Date: Fri Oct 01 2004 - 15:43:34 MDT
Michael Wilson wrote:
> ...expecting there to be a
> notion of desire for goal-seeking agents built into the structure of the
> universe itself is simply a layer confusion (...).
My understanding of the issue was seriously muddled. I still don't
understand it, but at least I no longer incorrectly believe that I do.
> Regardless, all the people I know of who are actually developing AGI and
> FAI do have subjective morality built into their goal systems, so it's not
> a terribly relevant statement.
I agree. It was mainly an answer (based on a muddled understanding of
the relevant issues) to a specific question.
>>"All-Work-And-No-Fun"-mind may well be among the first things I'd do
>>after uploading, so some of my possible future selves would likely be
>>penalized by an implementation of your suggestions. My opinion on those
>>suggestions is therefore probably biased...
>>
>>I don't think that what I would (intuitively) call 'consciousness' is
>>by definition eudaemonic, but since I don't have any clear ideas about
>>the concept that's a moot point.
>
>
> Argh. Firstly, intuition is worse than useless for dealing with these
> concepts. Secondly, you shouldn't even be considering self-modification
> if you don't have a clue about the basic issues involved. Without that
I wasn't seriously considering it. An FAI shouldn't allow me to make any
self-modifications that would lead to undesirable consequences; my
predictions about what I would do after uploading assume that an FAI, if
present, would allow it (and, for that matter, not actively discourage
it). If I uploaded without an FAI present, I probably wouldn't play with
the internal variables of my mind until I understood enough about the
system to do so reasonably safely.
Thanks for the warning about intuition.
> you might as well adopt the principle of indifference regarding what you
> might do, possibly modified by the opinion of people who have put serious
> thought into it. "All-Work-And-No-Fun" can be interpreted two ways; an
> Egan-style outlook that merely makes leisure undesirable (incidentally
> what else do you care about so much that you want to use your mind to
> achieve?)
Well, I mentioned what it was ("objective morality"), blissfully unaware
that it didn't make any sense.
>>Why? I don't understand why the activities mentioned before the quoted part
>>("humor, love, game-playing, art, sex, dancing, social conversation,
>>philosophy, literature, scientific discovery, food and drink, friendship,
>>parenting, sport") are relevant for the value of human life.
>
>
> What else does human life (as distinguished from say a theorem proving
> program running on your PC) consist of?
Aside from complexity? Nothing morally important (for some values of
'moral'). I didn't (and still don't) assume that there is any fundamental
difference in moral relevance between the two.
>>From the perspective of an agent that is striving to be non-eudaemonic
>>(me), the proposed implementation looks like something that could destroy
>>a lot of efficiency at problem-solving.
>
>
> From the perspective of the majority of humanity, why should we care? Being
> less efficient just means that things take longer. Assuming we've dealt with
> the competition problem, what's the hurry?
You shouldn't care. Agents striving to be non-eudaemonic are afaict
currently very rare, and unlikely to become significantly more numerous
in the future. There are plenty of other people, with more resources at
their disposal, who may perceive your research as a threat; seriously
considering such an unimportant group of potential enemies is certainly
not the best use of your resources.
As for the moral aspect: if you assume that all eudaemonic agents are
intrinsically morally relevant and all non-eudaemonic ones are not, then
any eudaemonic agent striving to become non-eudaemonic will simply be
considered insane. Whether that means you should try to stop them from
becoming non-eudaemonic, or let them do it and then, if they succeed,
severely limit their rights, is imho a comparatively minor issue; in any
event, seriously considering the opinions of lunatics is also a waste of
resources.
Sebastian Hagen