From: Samantha Atkins (samantha@objectent.com)
Date: Wed Sep 18 2002 - 12:21:26 MDT
Eliezer S. Yudkowsky wrote:
> Samantha Atkins wrote:
>
>>
>> I don't think so. If you are having trouble keeping yourself alive
>> because of grossly inadequate nutrition, shelter and very spotty water
>> supplies of poor quality then your low-level survival drives will,
>> quite rightly, swamp your higher level needs for a time. This would
>> be true of any conceivable sentient with minimal low-level survival
>> requirements. So please come down to earth just a bit, if you
>> will.
>
>
> Um... no. I refuse. Life on Earth just before the Singularity should
> be about the Singularity because the Singularity is what matters for the
> next billion years. This doesn't change regardless of where you live,
> although whether or not you are likely to *know* about the Singularity
> may have a differential probability that varies strongly with geographic
> location. Now it may be that low-level needs place you at subsistence
> level and leave very little free energy, but if you know about the
> Singularity, you should still be concerned with it, with whatever free
> energy you have. And it may even be that being placed in certain
> environments is enough to disrupt or destroy an altruistic goal system.
> But in this case I would consider, not quite that "you" have died, but
> that you have been transformed into something whose philosophical
> judgments you should no longer take into account.
Ah. OK. I do have a lot of respect for this. But I will point
out that if you substituted "the Kingdom of God" for
"Singularity" above you would still have a self-evidently true
declaration of dedication modulo your belief in said Kingdom or
Singularity as the ultimate end and truly important thing. But
I am not attempting to poke fun in any way. I very much do
understand what you are saying above and admire your dedication
and articulation of it. I would however sound a warning that
you may be missing how the Singularity grows out of the present
and the people of this world and thus tend to ignore them in
favor of it. Or so it sometimes appears.
>
> Dedication isn't a trivial thing, Samantha. I don't know how strong my
> will is. I don't know what it would take to break my will, whether it's
> a little or a lot of adversity. But it doesn't matter. A broken
> Eliezer, however little or however much it takes to break Eliezer, is
> not relevant to this discussion. While my will continues it will be
> devoted to the Singularity because that is the pretty darned unambiguous
> outcome of rational altruism. And if I could continue that under
> whatever conditions of adversity, I would be personally proud of it -
> although obviously Singularity considerations mediate that I should try
> to encounter as little adversity as possible; this isn't about pride.
>
> "If Brahma has me burnt, I will spit into the flames. If he has me
> strangled, I will attempt to bite the executioner's hand. If my throat
> is cut, may my blood rust the blade that does it."
> -- Sam, in Roger Zelazny, "Lord of Light"
>
> I don't know if I could, but I know that I *should*, remain focused on
> the Singularity on the way to my execution, and shout out my last
> thoughts about Friendly AI to the firing squad to take advantage of the
> very small but nonetheless real probability that someone will remember
> them and put them online. Not out of defiance; because that is, under
> those hypothetical conditions, the best remaining pathway to helping
> humanity and humanity's future. It's not a *good* pathway, but it's
> what's available under those conditions. And I do not deny the
> possibility of that kind of dedication to any of the six billion people
> on this Earth, regardless of what conditions they live under.
>
> I think you do humanity a disservice if you suppose that human beings
> are capable of altruism only under comfortable conditions. Maybe it's
> true of me personally. If I do my job competently, it will never be put
> to the test. But if I do fail that test, that's a flaw in me, not
> something that changes the correct course of action.
I don't in fact believe that altruism is possible only in
comfortable conditions. But it is a strange sort of
altruism, at least at first glance, that seems to stand
dispassionate in the face of real human suffering now all over
the world. Yet this too makes a great deal of sense. It also
is reminiscent of various spiritually dedicated people who point
out it is not the material circumstances that matter but the
internal state and eventual outcome. Dispassion is a necessity
of seeing beyond and working beyond the immediate.
>
>>> Why is it, Ben, that you chide me for failing to appreciate
>>> diversity, yet you seem to have so much trouble accepting that this
>>> one person, Eliezer, could have an outlook that is really seriously
>>> different than your own, rather than some transient whim? I don't
>>> have any trouble appreciating that others are different from me, even
>>> though I may judge those differences as better or worse. You, on the
>>> other hand, seem to have difficulty believing that there is any
>>> difference at all between you and someone you are immediately talking
>>> to, regardless of what theoretical differences you might claim to
>>> believe in or respect.
>>
>>
>> Why is it that you are beginning to take this attitude of being above
>> it all and almost of a different species from the rest of us? As
>> wonderfully bright and dedicated as you are, I don't believe that this
>> is justified, at least not yet.
>
>
> Why not say: "No matter *how* bright and dedicated you are, or aren't,
> that attitude would *never* be justified." This helps to avoid debate
> about side issues.
>
OK. Hmmm. I see your point.
>>> Suppose that I did tend to focus more on material poverty if I were
>>> experiencing it. That supervention of my wired-in chimpanzee
>>> priorities is not necessarily more correct.
>>
>>
>> If it is the difference between life and death, then it is higher
>> priority in that it is prerequisite to the rest of your goals. There
>> must be enough surplus of energy beyond what is needed to survive and
>> accomplish some basic functionality before higher goals can be
>> addressed. Many people in this world do not have that much today.
>> That also means many brains of great potential are never
>> utilized.
>
>
> I agree. This is yet another problem that can best be fixed via (drum
> roll) the Singularity. That's the most effective, most efficient,
> fastest way that I can put any given amount of effort into solving that
> problem. If I do it some other way, I fail. If I do it some other way
> because of a reason other than my anticipation of maximum benefit to
> those people, I fail in altruism.
>
OK. Thanks for the clarification. Sometimes the "anticipation
of maximum benefit to those people" is not so obvious.
>>> "Vastly"? I think that word reflects your different perspective (at
>>> least one of us must be wrong) on the total variance within the human
>>> cluster versus the variance between the entire human cluster and a
>>> posthuman standard of living. I think that the most you could say is
>>> that some humans live in very slightly less flawed conditions than
>>> others. Maybe not even that.
>>
>>
>> Your perspective includes hypotheticals not currently in existence.
>
>
> Um... yeah. And this is a bad thing because...
>
Only because weighing these hypotheticals over and above
unquestionable existing realities can dull one's apprehension
of, and response to, those realities. On the other hand,
projecting what the goal or best state is, and working
vigilantly toward it, can be the very deepest response to those
obvious problems and difficulties. There is a question of balance.
>> Given current existential conditions, some humans live in vastly more
>> flawed conditions than others.
>
>
> Adversity isn't a relative quantity, at least not as I measure things.
> Whether you're doing well or poorly doesn't depend on whether someone
> else has more or less. Right now all known sentient beings live under
> conditions of tremendous adversity.
>
I very much agree. However, I don't currently believe that the
Singularity is the only thing of importance for improving the
lot of humankind.
>>> > As a person of great material privilege, you are inclined to
>>> > focus primarily on the limitations and problems we all share.
>>>
>>> As a student of minds-in-general, I define humanity by looking at the
>>> features of human psychology and existence that are panhuman and
>>> reflect the accumulated deep pool of complex functional adaptation,
>>> rather than the present surface froth of variations between cultures
>>> and individuals.
>>
>>
>> Hmmm. That froth is where the people live!
>
>
> No, that froth is what the people focus on, because those variances are
> the ones which are adaptively relevant to *differential* reproduction
> and hence perceptually salient. If there exists a race which has
> evolved to be six miles tall because height is the most powerful
> evolutionary advantage, they will, among themselves, focus on the
> remaining six inches worth of variance. Hence the IQ wars.
>
Understood, but I was attempting to express the concern that much
of what human beings are seems to be dismissed above.
>>
>> I think you are attempting to turn yourself into an FAI disconnected
>> from your own humanity. I am not at all sure this is a good thing.
>
>
> 1: What the heck *else* do you think I've been trying to do with my
> mind for the last six years?
>
> 2: If you think that is in any wise, shape, or form a bad thing, you
> must have an extremely different conception of FAI than I do.
>
> 3: You've never met an FAI or you wouldn't say that. (Neither have I,
> of course, but I have a good imagination.)
>
> 4: If it's good enough for an FAI, it's good enough for me. The only
> question is whether *I* can manage it.
>
> As for being "disconnected from my own humanity"... is this a generic
> way of saying "different from what you were last week"? And if not,
> disconnected from humanity and connected to what, exactly?
>
I understand that also, and figured that is the way you see it.
However, you are a human being, and I believe it is more fruitful
to the creation of an FAI, to your own well-being, and to
enrolling other human beings if you retain your humanity a bit
more explicitly. Now, I also understand that dedication to such
a goal very much does change you and make you in many ways
different from other people. Some of that is inevitable, but
there is also danger there. I believe that you are aware of it.
That it is "good enough for an FAI" does not mean it is good
enough for you. Yours is a drastically different architecture
and situation.
- samantha