From: Samantha Atkins (email@example.com)
Date: Wed Jun 22 2005 - 15:11:45 MDT
In my view MNT is essential if we are to get to a truly Abundant world
on multiple levels congruent with our benevolent goals, and perhaps
essential to human survival. Many of the current global tensions
that threaten devastation play off of actual or assumed scarcity.
Ecological dangers are much easier to defuse given MNT. Life
extension, ending aging, and more or less ending unwanted death are
greatly enabled by MNT. Yes, MNT may be used for ill but the good it
enables is the very essence of what keeps me firmly in the
transhumanist camp. It is the veritable substance of many dreams long held.
The idea that a true AGI can be brute-forced by more computational
power has at various times been touted as a near truism and as bunk,
often by the same voices. The notion that a lack of that much power
will make it more likely that the brighter folk working on FAI will
create AGI faster than other bright people perhaps less interested in
Friendliness is questionable at best. A more worrisome question is
whether we can afford to wait for that slower development.
Relinquishment is not an option when it comes to MNT for reasons
capably discussed elsewhere.
I do not believe the world can continue much longer (perhaps as
little as 5-10 years) without one or both of AGI (friendly, or at
least not decidedly unfriendly) and MNT before a major economic
and/or military catastrophe puts most of our dreams out of reach
for at least a generation. It could be a LOT worse than that.
We can't stand still and the paths forward are fraught with peril.
It is not obvious that getting MNT first will doom us. It is not
obvious that SAI will be created any time soon, even with MNT. FAI is
less certain still.
On 6/21/05, Thomas Buckner <firstname.lastname@example.org> wrote:
> --- Phillip Huggan <email@example.com> wrote:
> > under present realities, an
> > AI of less than assured friendliness might be
> > less risky than facing the run-up to MNT;
> > turning on a "buggy" GAI system might be the
> > only defence against an emergent MNT enabled
> > tyranny.
> Although I somewhat agree with this, I suspect
> Eliezer would not; his position is that imperfect
> GAI is far, far more dangerous than almost anyone
> realizes. Both he and Ben have gone fairly silent
> on the list, which I think may actually be a good
> thing, since it implies they are too busy doing
> actual work on the problem to chatter about it
> with us.
> Tom Buckner
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT