From: Thomas Buckner (tcbevolver@yahoo.com)
Date: Sun Mar 13 2005 - 07:34:09 MST
--- Ben Goertzel <ben@goertzel.org> wrote:
>
> Hi,
>
> David Hart just pointed out this guy to me
>
> http://en.wikipedia.org/wiki/Masahiro_Mori
>
> I had never heard of him, but maybe some of you
> have.
>
> It's some fairly wild robo-Buddhistic
> philosophy ;-)
>
> The comments at
>
> http://www.karakuri.info/perspectives/
>
> about differences btw Japanese and Western
> attitudes to robotics seem
> right-on.
>
> Westerners tend to fear uberhuman killer
> megalomaniac robots, whereas
> Japanese tend to think of robots as their
> little buddies...
>
> Of course, each perspective has its partial
> truth to it...
>
> -- Ben
>
I've heard of him, from Tipler's *The Physics of
Immortality* I think, and used the following quote
in my own book:
". . . to learn the Buddhist way is to perceive
oneself as a robot."
— Masahiro Mori
Because I thought, "He really gets it!" For some
reason I haven't read any of Mori's books; IIRC
the local library had none. In any case, this is
one of the memes that led me to think (though I
cannot prove) that any real superintelligence is
apt to have the attitude of a bodhisattva (one
who chooses to stay in the trenches and help the
rest of us reach Buddha-hood). I thought of using
this in my long-planned AI novel, in a way I
think I won't mention.
This might be a good place to mention a concept
that may be useful in thinking about Collective
Volition and FAI. In some types of software, such
as music software, we have the idea of
'nondestructive editing.' In nondestructive
editing, you have, say, a WAV sound file you just
recorded or a piece of video, and the software
keeps an untouched copy, so that no matter what
you do, if the results turn out badly, you still
have a pristine original.
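To make the idea concrete, here is a minimal
sketch in Python (the class and method names are
my own illustration, not any particular editor's
API): the recorded samples are never modified;
edits accumulate in a separate list and are only
applied when the result is rendered, so reverting
is just discarding operations.

class NondestructiveEditor:
    def __init__(self, samples):
        # The original is stored as an immutable tuple
        # and never touched again.
        self._original = tuple(samples)
        self._edits = []  # ordered functions: samples -> samples

    def apply(self, edit):
        # Queue an edit (e.g. a gain change) without
        # altering the original.
        self._edits.append(edit)

    def undo(self):
        # Discard the most recent edit; the original is
        # always recoverable.
        if self._edits:
            self._edits.pop()

    def render(self):
        # Produce the edited result by replaying every
        # edit on a fresh copy of the original.
        samples = list(self._original)
        for edit in self._edits:
            samples = edit(samples)
        return samples

# A bad edit costs nothing: undo it and the pristine
# original is back.
editor = NondestructiveEditor([0.1, 0.5, -0.3])
editor.apply(lambda s: [x * 2.0 for x in s])  # double the gain
print(editor.render())  # [0.2, 1.0, -0.6]
editor.undo()
print(editor.render())  # [0.1, 0.5, -0.3], untouched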
I have said before that, given the cussedness and
unreason of many humans, and given the utterly
incompatible volitions many of them have, an FAI
would have no choice but to override their wishes
at least some of the time. If religious fanatics
wish to see the world blow up, it can't very well
let them.
In a way, all discussion of FAI is like
'designing a genie.' If the genie gives you
exactly what you ask for, you're apt to get
something you don't want (every joke in which a
genie appears is a parable on this topic). We
want the genie to understand our wishes and needs
*better than we do*. And since CV refers to what
our better selves want, we can even foresee that
we are giving the genie some permission to make
us those better selves.
We want ver to edit us (if ve must edit us)
nondestructively. It may be well to make it
explicit, as a supergoal, that humans will be
improved wherever possible, *with the option of
reversing errors*. And in order to do that, we
need a very clear idea of what such improvement
would entail. What do we stand to gain? What are
we unwilling to lose? Even in these questions CV
is muddled; there are those who would prefer to
see the human race as sexless as statues, unable
to appreciate music, inert in the presence of
strong drink or sweet leaf, for in their eyes
these things are sinful. An FAI might find verself
forced to sandbox a large portion of the human
race in illusory worlds suited to their
preferences, or else alter them against their
stated wishes before they could accept the
vertiginous freedoms it might offer. As far as I
am concerned, however, improvement always means
more freedom.
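If it helps to see that constraint spelled out,
here is a toy sketch in Python of editing *with
the option of reversing errors* (entirely my own
illustration, not anyone's actual FAI design):
every change is applied together with its inverse,
so a change later judged an error can be rolled
back exactly rather than patched over.

class ReversibleEditor:
    def __init__(self, state):
        self._state = state
        self._log = []  # stack of (description, inverse_function)

    def change(self, description, forward, inverse):
        # Apply a forward transformation, remembering
        # how to reverse it.
        self._state = forward(self._state)
        self._log.append((description, inverse))

    def rollback(self, n=1):
        # Reverse the last n changes, newest first.
        for _ in range(min(n, len(self._log))):
            _description, inverse = self._log.pop()
            self._state = inverse(self._state)

    @property
    def state(self):
        return self._state

# An 'improvement' that proves mistaken is reversed,
# not worked around.
person = ReversibleEditor({"freedom": 1})
person.change("grant more freedom",
              lambda s: {**s, "freedom": s["freedom"] + 1},
              lambda s: {**s, "freedom": s["freedom"] - 1})
print(person.state)  # {'freedom': 2}
person.rollback()
print(person.state)  # {'freedom': 1}, the error is undone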
Nick Bostrom offers convincing (to me, though I'm
no expert) arguments that either we are already in
a simulation or no such simulations will ever be
run. If the former is true, there is a range of
different types of simulation we might inhabit.
One might speculate that one such simulation would
be a 'soft-sided sandbox,' so to speak, in which
the 'safe copies' of altered
humans might reside, either frozen in time or
experiencing ongoing lives where the FAI never
forces them outside their comfortable, limited reality
tunnels (or, as another poster called them,
magisteria). The 'soft sides' of such a sandbox
refer to its essentially protective nature and to
the option of escape if one finds it confining;
an essential feature of a soft-sided sandbox
would be that it contains instructions for
escape, if one cared enough to look.
Even more than 'designing a genie,' we might say
FAI research is 'designing a bodhisattva.'
Tom Buckner