From: Samantha Atkins (samantha@objectent.com)
Date: Tue Sep 17 2002 - 01:07:08 MDT
Chris Rae wrote:
> Aaron,
>
>> Anyone who thinks humanity can be prepared for the singularity doesn't
>> understand what singularity means.
>
>
> I totally disagree with this statement, Aaron. The singularity is NOT
> about technology, it is about FREEDOM of the human race from suffering.
> Since my last post, approximately 30,000 children under the age of 5
> have died as a result of malnutrition or preventable disease. In case
> you haven't noticed, outside of the 'free' world, life is defined by
> hardship, misery and suffering.
Some good questions there. But life is not defined by hardship,
misery, and suffering; it is self-defining as life. It is a
great deal harder than it needs to be for the vast majority of
earth's inhabitants, even now, years before the Singularity. If
we are indeed motivated by caring for all this suffering
humanity, then why not also act now to improve the lot of as
many as possible? Ah, because all energy is needed to create the
one Great Fix for all. Well, that *would* be nice. But it is
hard to justify not doing what can be done in the meantime.
>
> Let's say - hypothetically of course - I hand you (Aaron) a true AI entity
> that is able to improve its own processing capability. What exactly are
> you going to do with it? Sure, it can provide you with everything you
> ever dreamed of, but what about the other 6 billion people on this
> planet? What is their fate? Are you going to allow these people to
> choose their own destiny, or are you going to force your version of
> what's best on them?
>
Good question.
> For example, when a human rights group used FORCE to stop an ancient
> ritual that involved the mutilation of female genitalia as a rite of
> passage, it was the women who fought back claiming it was their right
> and no one had the authority to take that away, even though the human
> rights group was acting in the best interest of the women.
>
Horrible example. This rite of passage has been forced on women
to keep them from fully being able to enjoy sex, supposedly to
protect their purity but actually to protect their husbands from
any thought of straying for better sex partners. In strictly
human terms the practice is barbaric and deplorable in the
extreme. Stopping it is not an unreasonable use of force. Using
force to stop the use of force on unwilling or unwitting victims
is not a clear wrong. It might have been better to educate the
practice away, or in the future simply to repair the damage.
There is a great deal of difference between this ambiguous case
and the harm that could be done by the incautious and
over-exuberant actions of an AI, self-directed or otherwise.
> My point is, if/when singularity occurs, it can't be forced on the
> general population, they must willingly accept the technology. IMO,
> creating the technology necessary to achieve singularity is only half
> the battle, the other half is convincing the people it is going to
> improve their lives. The true goal of the singularity is in fact to
> improve people's lives, isn't it?
>
I largely agree with your conclusions and with the focus. It is
not universal here, though. For some, the goal is creating more
intelligence, preferably maximally more and better intelligence,
whether or not that is good for people and their lives.
- samantha