From: Samantha Atkins (samantha@objectent.com)
Date: Mon Jan 29 2001 - 02:52:21 MST
Spudboy100@aol.com wrote:
>
> Here, on page 167 of Robot (1999, Oxford University Press):
> "Thus an ultimate cyberspace, the physical 10^45 bits of a single human body
> could contain the efficiently coded biospheres of a thousand galaxies--or a
> quadrillion individuals each with a quadrillion times the capacity of a human
> mind."
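
For what it's worth, the quoted figures do hang together internally, if
"quadrillion" is read as 10^15 (a quick sketch; the readings are mine, not
the book's):

    # Reading the quote's own figures (my interpretation, not Moravec's
    # derivation): one "quadrillion" taken as 10^15.
    body_bits  = 10**45   # physical bits claimed for one human body
    people     = 10**15   # "a quadrillion individuals"
    multiplier = 10**15   # each with "a quadrillion times" a mind's capacity
    bits_per_mind = body_bits / (people * multiplier)
    print(f"implied bits per human mind: {bits_per_mind:.0e}")  # -> 1e+15

So the claim prices a human mind at roughly 10^15 bits, which is at least
the order of the memory estimates used earlier in the book, if I recall
them correctly. The trouble is everything around that number.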
Please explain to me how a single biological body can encode all the
biological bodies, and the entire biospheres, of a thousand galaxies.
Obviously the "bits" of the things to be coded grossly exceed the bits
available, so this is not a bit-wise encoding.
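
To put a rough order of magnitude on "grossly exceed" (every input below
is my own guess, chosen only to fix scales):

    # How much raw material the encoding must swallow, counted the same
    # physical way the quote counts a body. All inputs are guesses.
    body_bits  = 10**45                    # physical bits per body, per the quote
    galaxies   = 10**3                     # "a thousand galaxies"
    stars      = 10**11                    # stars per galaxy, order of magnitude
    biospheres = galaxies * stars // 1000  # guess: 1 star in 1000 hosts one
    organisms  = 10**10                    # guess: body-scale organisms per biosphere
    raw_bits   = biospheres * organisms * body_bits
    print(f"raw bits to encode:  {raw_bits:.0e}")             # -> ~1e+66
    print(f"compression implied: {raw_bits / body_bits:.0e}") # -> ~1e+21

Twenty-odd orders of magnitude have to vanish somewhere, and the text does
not say where.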
What is considered redundant, what is the coding algorithm, and how
exactly do we know it well enough to make such a wild claim about the
possible size of the encoding? The material in the book at this point is
extremely seat-of-the-pants.
Even if you could encode it this way, how much more capacity -- how many
more bits -- would be needed to actually run the encoding in any
meaningful way?

While we are at it, exactly why would an encoded simulation of the
universe, running within one part of the universe, be a more efficient
rendition of the universe than the universe itself? On the face of it
that seems highly unlikely. In practice it takes considerably advanced
and copious hardware to simulate biological systems down to the cellular
level, or anywhere close. Perhaps I am missing something.
- samantha