Re: Unbounded happiness

From: Lee Corbin (lcorbin@rawbw.com)
Date: Thu May 01 2008 - 01:35:42 MDT


First, thanks to Stuart for pointing out Anders' paper
http://ftp.nada.kth.se/pub/home/asa/Work/Brains/Brains2/
He doesn't have too much directly to say about what
kinds of identity transformations a human being would
have to undergo when becoming a Jupiter-sized
Jupiter brain, but the last four paragraphs of
http://ftp.nada.kth.se/pub/home/asa/Work/Brains/Brains2/node14.html
are pertinent.

Kresoski writes:

> [Lee wrote]
>
> > Bottom line: large brains should have no reason to
> > choose to operate slowly, so therefore thought
> > will be conducted at c, and therefore what an
> > "individual sentience" is will be limited in size.
> > Now yes, I can even now perform "library inquiries",
> > get an answer to a math problem by letting my
> > computer run long enough, but in neither case is
> > the library or my computer to be considered part of me.
>
> See that's the issue that I have with it-- what do we want
> to consider as part of you right this moment? Firstly we
> have an autonomic nervous system that we are completely
> unaware of-- is that part of you?

No, not at all. I can't go so far as to say that *nothing*
which I am not aware of can be part of me, for a lot of
thinking and memory reference is unconscious, but on the
other hand, heart rate, digestion, breathing, respiration rate,
salivation, perspiration, diameter of the pupils, etc. (taken
from Wikipedia's list in their ANS article) simply are not
an important part of who I am.

> What about your own memories, some of which are not
> consciously accessible right now? What if I were to create
> a neural implant that vastly improved the rate at which I
> could exchange information with the outside world, connect
> to a series of other computers, or individuals with similar implants.
> What consequence does this have for my own individuality?

Excellent questions. Take what I consider to be "my memories"
right now. I'm very familiar with Euclid's Algorithm, not only
for finding the Greatest Common Divisor, but for finding integer
solutions to 107x - 337y = 1. This is a pretty discrete bundle
of knowledge. It's mine right now: if damage occurs to it (I
forget part of one of the algorithms) then that's only happened
to *me*. But what's going on here? Clearly if the tech you
suggest enabled a lot of people to access those same memories,
then it wouldn't feel as though it were *mine* anymore.
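To make that "discrete bundle" concrete, here is a minimal Python sketch of the extended Euclidean algorithm applied to 107x - 337y = 1 (the function name `egcd` is just illustrative):

```python
def egcd(a, b):
    """Extended Euclidean algorithm.

    Returns (g, x, y) such that a*x + b*y == g == gcd(a, b).
    """
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

# Solve 107*x - 337*y = 1 by rewriting it as 107*x + 337*(-y) = 1.
g, x, neg_y = egcd(107, 337)
y = -neg_y
print(g, x, y)  # 1 63 20, and indeed 107*63 - 337*20 == 1
```

Since gcd(107, 337) = 1, integer solutions exist; x = 63, y = 20 is the particular one the back-substitution produces.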

Well, I cannot answer these last two questions either. I only
know that I'm wary of embracing those technologies, and
would never do so except with a limited number of my
selves (my duplicates).

> Let's assume that we have a solar system brain that is
> massively parallel in a way analogous to our own brains.
> Is it so inconceivable to have a scenario whereby the
> cognitive experience is 'slowed down'?

Not at all! I was merely saying that you'd have to
artificially slow it down to maintain what we think
of today as a human mind. As Anders wrote

    The subjective effects of S depends on the application. For
    data retrieval and communication, it just creates a subjective
    delay which may or may not be acceptable (a delay of a minute
    in delivering an email is usually acceptable; a one-minute
    delay in delivering a frame of video is not acceptable).
    Subjective distances increase for very fast minds; for entities
    exploiting nanosecond timescales at the speed of light
    distances of centimeters are significant, for femtosecond
    entities micrometers and for nuclear entities femtometers.
    Structures larger than this will be ``large'' compared to the
    processes that go in them.

    For infomorphs, delays limit the physical distribution of their
    component processes: if they are too far apart, the being would
    have to slow down its rate of subjective time in order to keep
    synchronized. Even if the processing is infinitely fast
    lightspeed limits the speed of infomorphs if they wish to
    interact with the outside environment at a certain rate; since
    the human mind acts as a whole on a time scale of hundreds of
    milliseconds, a human-like infomorph running at ``normal''
    speed would at most be able to extend 30,000 kilometers before
    the delays started to limit its speed.

> If we compare our brains to the cluster of nerves in insects for
> example-- do we have a less coherent experience since a neural
> signal takes longer to propagate across our cognitive apparatus
> than it does theirs?

Yes, I guess so.
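A rough back-of-the-envelope comparison supports that guess. The numbers below are assumed round figures, not measurements: a conduction velocity of 10 m/s (mid-range for unmyelinated axons), ~10 cm across a human brain, ~1 mm across an insect ganglion:

```python
# Propagation-delay comparison (assumed round numbers, not measured values).
velocity_m_per_s = 10.0    # assumed mid-range axonal conduction velocity
human_extent_m = 0.10      # ~10 cm across a human brain
insect_extent_m = 0.001    # ~1 mm across an insect ganglion

human_delay_ms = human_extent_m / velocity_m_per_s * 1000
insect_delay_ms = insect_extent_m / velocity_m_per_s * 1000
# Roughly 10 ms for the human brain vs. 0.1 ms for the insect ganglion,
# a factor-of-100 difference in signal latency across the whole apparatus.
print(human_delay_ms, insect_delay_ms)
```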

> You say that "individual sentience is limited in size" -- well imagine
> we create a solar system computer. Does it then just spontaneously
> generate a myriad smaller 'sentiences' within it? where does one
> begin and the next one end?

I should amend what I wrote. I meant to say that "individual sentience
*of our familiar human kind* is limited in size".

> If you feel uneasy about talking about subjective experience
> (I do, it makes too many assumptions) we can still just talk
> about a solar-system sized computer, and how efficiently it
> processes.

And whether a human brain could be made that large (using
any technology whatsoever) without becoming very, very
slow in comparison to other humans.

> I don't see any reason why size would become a significant
> barrier.

Agreed. But if we want to retain human subjective experience,
it looks like a human-like infomorph running at "normal" speed
"would at most be able to extend 30,000 kilometers before
the delays started to limit its speed."

> It would of course have a minimum amount of time that it
> would take to perform a calculation (the same is true of
> our own brains) but the sheer degree of parallel computing
> power would likely override the speed of light constraint
> in terms of efficiency of computation.

Agreed. But our human subjective experience would be
very much like that of consulting an "exterior" source of
information. That is, even if signals between my neurons
traveled at c, information in my brain more than a mere
30,000 km away would start to feel external.

Lee

P.S. I didn't quote Anders' last two paragraphs on that
page which may closely support what you were saying,
because of time and space considerations in this email.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT