From: Leonardo Wild (firstname.lastname@example.org)
Date: Wed May 21 2003 - 16:21:56 MDT
Cliff Stabbert wrote:
> Oh dear. Do you often feel victimised?
It is perhaps due to having started off on the wrong foot, when the
rules were not yet clear (to me). Seeing so much speculation about
"reality" being passed off as "scientific" has made me wonder where the
redshift of science starts on this list.
> >LW> I see that you don't value longwinded paragraphs, or that your value of
> >LW> them is lower (if not in negative) than short paragraphs that go
> >LW> directly to the point.
> CS: You see incorrectly. I value paragraphs of the right length to make
> their point. Yours did not, in my estimation, have any real point at
> all: discussing the abstract meaning, underlying values, whatever, of
> money in the context of SIAI's request for same seems completely
> irrelevant to me.
I was making a point about values and using your comment as an example.
You value what interests you. The question is: how do we make an FAI
value humanity so that it will be friendly towards us and not look upon us as
> CS: > Douglas Adams' observation about money is by no means unique, but it
> is a nice compact formulation.
Compact formulations don't always manage to show you the core of a given
reality or problem. The problem with money is far from abstract.
Naturally, if you have no interest in a given subject, you will regard
any deeper contemplation and perusal of it as irrelevant.
> CS: i.e., other people
> believe that my twenty dollar bill has value and act accordingly,
> therefore I can act as if it has value.
That's another issue. Do you really believe that "my twenty dollar bill"
is _mine_, or, if you hold it, that it is _yours_? Holding money in your
hand doesn't necessarily mean that it's yours, just as renting a house
doesn't make the house yours. Yet we hardly question this in the least,
while we still direct our lives towards the acquisition of money.
And I'm not being facetious.
It is one of the core problems with money-as-we-know-it. (And I say
"as-we-know-it" because there are other kinds of money out there, and
money has evolved throughout history.)
> CS: > Come on, the money is a means to their goal, not SIAI's goal. You are
> being willfully obtuse.
I was trying to give an example of how basic values are based on what we
value, money being one of the pivot points of our pyramid of values and
one of the "goals" that so many human beings strive for nowadays ...
without knowing what it is or where it comes from. That's the absurdity of
it. As absurd as us trying to "create" a being, an entity, that is
friendly, when in our social exchanges we are veering towards something
that isn't part of what I would consider friendly. In my view, one of
the elements inherent in friendliness is some degree of respect.
There seems to be a great effort to ensure that friendliness can come
into existence in, or be part of, AI, though the descriptions I've
read so far of "friendliness" (what it is and how it can be achieved)
have not really been clear. So, if you or anyone can help me out in
this, I would be very grateful indeed.
Returning to the issue of "values" and what it is that we value:
As a rule, value exists when three basic elements come together:
1) An organism (an individual, a family, a community, etc.).
2) A need.
3) The quantity (of the needed thing).
So we come up with the phenomenon that:
"The value of something depends on the inner need of an organism and the
availability (and quantity) of the thing needed."
This means that the measure or amount of “value” is not inherent in
something without the existence of organisms and their needs.
Besides, there are also three qualitatively different types of value,
and they are once again circumstantial:
1) Cultural value,
2) Subjective value, and
3) Absolute value.
Cultural value is whatever a group of people has, over time, considered
valuable, the object's inherent "value" notwithstanding.
Cultural value is something people can accept as “having value” even
though, in practical terms, it may be useless.
Subjective value is based on three aspects: an organism —be it an
individual, a community, a company, a nation, etc.— has a need for an
object or service. The value of whatever object or service will grow the
less that object or service is available. Also, certain things are more
subjectively valuable than others (emotional, intellectual, spiritual),
but if they enter the realm of "basic survival needs," the subjectivity
of something valuable can turn into a near "absolute," that is, into
real value. If not having a glass of water means that I will die, that
glass of water has acquired a value so real and absolute that I'm
willing to give anything in exchange for it. In other words:
"Subjective value is a circumstantial phenomenon related to place and
time and relates to a given need and the availability of what would
fulfill that need."
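The relationship described above can be sketched numerically. This is only a toy model of my own devising: the function name, the 0-to-1 "need" scale, and the particular functional form (need divided by availability) are all illustrative assumptions, not anything the original post specifies.

```python
def subjective_value(need: float, quantity_available: float) -> float:
    """Toy model of the post's claim: value grows with the organism's
    need and shrinks as the needed thing becomes more available.

    `need` is on an assumed 0..1 scale (1.0 = survival-critical).
    The ratio form is an illustrative choice, not a derived law.
    """
    if quantity_available <= 0:
        # Nothing available: a genuine need makes value unbounded,
        # echoing the "near absolute" survival case.
        return float("inf") if need > 0 else 0.0
    return need / quantity_available

# Mild thirst, plentiful water: low subjective value.
everyday = subjective_value(need=0.2, quantity_available=100.0)
# Dying of thirst, almost no water left: value spikes toward the "absolute."
survival = subjective_value(need=1.0, quantity_available=0.001)
assert survival > everyday
```

Under this sketch, the same glass of water carries wildly different value depending on circumstance, which is the "circumstantial phenomenon related to place and time" the quoted formulation describes.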
The third and least “measurable” type of value is absolute value:
Absolute value is the value of anything that exists, and is immeasurable
because the implications of absolute value go into the realm of the
spiritual and of processes that can hardly be fathomed by our limited
understanding of Reality.
This means that the truth of this type of value is so great that we
cannot grasp it. The closest we can come to comprehending "absolute
value" is when
we speak of “real value,” that is, when our existence depends on
something and thus we come to value it “above all things.” In other
words, real value is a type of “subjective value” directly related to
our most basic needs of survival.
"Values," in this sense, are related to what we value. What
does/will/should an AI value in order for it to be friendly towards us
and those things that we need for our survival and subsequent growth as
individuals and as a species?
I believe that one of the keys to it is directly related to "need." If
we are not needed, an AI will have no reason to be friendly towards us.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT