Re: Shocklevel 5

From: Jeff Bone (jbone@jump.net)
Date: Sat Dec 08 2001 - 15:04:53 MST


Alden Jurling wrote:

> Leaving off the Sysop for the moment, let's say you are a Power and you want
> to survive for as long as possible.
> Let's say that you decide to do that by spreading as quickly as
> possible. As time passes, the chance of any particular possible disaster
> occurring rises, so your chance of survival drops. But as the space/mass
> you control expands, the set of disasters that could potentially destroy you
> grows smaller and smaller, and your chance of survival rises. So you have a
> race between increasing risk and decreasing vulnerability.

Precisely my point --- *and* it's not just a question of spread, it's also a
question of architecture (how much of that mass is necessary to constitute "you,"
how much can you lose to local disaster without affecting your goals, etc.) and
substrate (e.g., dark matter reduces certain risks and may create others that we
are not yet aware of).
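
A toy way to make that "race" quantitative (purely an illustrative sketch of my
own, with made-up numbers): treat the chance of a fatal disaster as a hazard rate
h(t) that falls as you spread, so survival probability out to time t is
exp(-cumulative hazard). If spreading drives h(t) down fast enough that the
integral converges, survival probability levels off above zero; if h(t) falls off
too slowly, annihilation is still certain in the limit, just postponed. In Python:

    import math

    def survival_probability(hazard, t_max, dt=0.01):
        """Numerically accumulate hazard and return S(t_max) = exp(-integral)."""
        cumulative, t = 0.0, 0.0
        while t < t_max:
            cumulative += hazard(t) * dt
            t += dt
        return math.exp(-cumulative)

    # Hypothetical hazard curves, purely for illustration.
    # Fast spread: hazard falls like 1/(1+t)^2, the integral converges,
    # so S(t) approaches a positive limit (about exp(-0.1) here).
    fast_spread = lambda t: 0.1 / (1.0 + t) ** 2

    # Slow spread: hazard falls like 1/(1+t), the integral grows like log(t),
    # so S(t) still drifts toward zero, just slowly.
    slow_spread = lambda t: 0.1 / (1.0 + t)

    for label, h in (("fast spread", fast_spread), ("slow spread", slow_spread)):
        print(label, [round(survival_probability(h, T), 4) for T in (10, 100, 1000)])

The point of the toy isn't the numbers; it's that whether risk "wins in the long
run" isn't a foregone conclusion: it depends entirely on how fast vulnerability
shrinks relative to how fast exposure accumulates.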

> Is it a foregone
> conclusion that risk will win out in the long run? It seems that you might
> be able to last as long as the universe can support your mass and energy needs.

So the final risk --- the one that is ultimately certain in an open universe --- is
that the Second Law of Thermodynamics (2LT) wins out and the universe can no
longer supply the necessary energy. So eventually, under a certain set of
assumptions, you're done for. But IMO, if you *get* to that point, past all the
other risks, that's a substantial achievement.

One way to avoid the heat death might be the creation of a steady-state (i.e.
"flat," neither open or closed --- or oscillating controllably between those
states) pocket universe that could then be closed off from the regular one. You
still have an issue with 2LT, but it's not clear what frame of reference 2LT refers
to in all cases --- there's a lot still to understand about the relationship
between entropy and the fine structure of spacetime. Assuming you're doing
engineering *with* the structure of spacetime itself, you might be able to
manipulate such things.

> Also, is there necessarily a significant difference between personal and
> 'species' survival? (if your backups are distributed enough, any calamity
> that destroyed all of them would necessarily destroy the whole species).

Probably not an important distinction overall, though it does come up in the
context of individual "moral" decisions a Sysop might have to make. E.g., say a
particular constituent were facing certain and undesirable annihilation, and the
only way the Sysop could save them was to "sacrifice" some other constituent, all
other constituents, or even itself. In that case, the difference has to be
considered.
Overall, assuming Friendliness, the perpetuation of the Sysop is good for
everybody.

> I'm not sure if there is really a point here or not.

It's all worth considering. :-)

jb


