Re: Can't afford to rescue cows (was Re: Arbitrarily decide who benefits)

From: Lee Corbin (lcorbin@rawbw.com)
Date: Sat Apr 26 2008 - 13:50:01 MDT


Samantha puts forth an extremely sensible list here:

> If I was the AGI (and thought more or less like I do today) and was
> charged with the ultimate well-being of all sentients then my solution
> would be simple.
>
> 1) upload all sentients into worlds identical to their current worlds or
> of their choice for more evolved sentients;

Humans, for example, have a choice in the matter, but cows are too
stupid even to understand such a choice. I completely agree.

> 2) by design all sentients have up to the moment back-ups;
> 3) let them live by whatever rules (or defaults from their previous
> conditions) that they choose;
> 4) if they off themselves or 'die' or come to serious injury, they
> are reinstated, but likely without much memory of what came before,
> though loaded up with issues from before to work through;

If an accident befalls them, then there is no reason to tamper with
their memories at all. Otherwise, rather than letting them terminate
themselves (as they're sure to do again), either talk them out of
choosing self-destruction, or instantiate a version of them that is
as close as possible to the original but wants to keep on living.
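To make the branch explicit, here is a toy sketch in Python (entirely
my own framing; every name in it is hypothetical):

    def reinstate(backup, cause_of_death, persuadable):
        """Toy restore policy: decide which state gets reinstated."""
        if cause_of_death == "accident":
            return backup                # restore untouched; no tampering
        if persuadable:                  # talked out of self-destruction
            return backup
        # closest variant of the original that wants to keep living
        return {**backup, "wants_to_live": True}

    # an accidental death restores the backup with memories intact
    print(reinstate({"id": 1, "wants_to_live": False}, "accident", False))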

> 5) churn so each sentient becomes more and more enlightened / reaches
> its highest potential at its own pace;

Some human entities will not choose such a thing, what I
call a "path of personal evolutionary progress". Take a
devout traditional religionist. There's nothing wrong with
giving him the 1 mm^3 of material resources he requires,
and the 1 second per real second of runtime [1] he requires,
to live as he wishes indefinitely.

> 6) interfere only as judiciously and minimally as possible to avoid
> forcing the outcome to something other than what the sentient
> would ultimately choose.

Right. Which is why I amended your #5.

> In short a full VR multiverse with perfect reincarnation overseen by a
> fully benevolent God/Mind. Otherwise I think universal or perfect
> Friendliness is a rather nasty farce.

I wouldn't call it a farce, exactly, but I know what you're getting
at and I agree completely. Silicon/diamondoid substrate on Earth alone
will provide ample resources per individual human (as they're now
constituted), resources incredibly small relative to the whole, and it
would be greedy and unappreciative of the FAI to do anything less
for humans.

Now, the question of how many resources more advanced and
ambitious humans should receive is vastly harder. Perhaps as much
as an entire cubic meter, at 1 sec/sec of processing. But we can
leave that, conveniently, to the FAI.
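For concreteness, a quick back-of-envelope sketch in Python (the
population and Earth-volume figures are just my own round numbers):

    # ~10^10 people; treat volume as the only constrained resource
    HUMANS = 10_000_000_000

    # baseline allocation: 1 mm^3 apiece (1 m^3 = 1e9 mm^3)
    baseline_m3 = HUMANS * 1e-9
    print(f"1 mm^3 each: {baseline_m3:.0f} m^3 total, "
          f"a cube ~{baseline_m3 ** (1/3):.1f} m on a side")

    # ambitious allocation: a full cubic meter apiece
    ambitious_m3 = HUMANS * 1.0
    EARTH_M3 = 1.08e21                   # Earth's volume, roughly
    print(f"1 m^3 each: a cube ~{ambitious_m3 ** (1/3) / 1000:.1f} km "
          f"on a side, {ambitious_m3 / EARTH_M3:.1e} of Earth's volume")

Even the "ambitious" allotment comes out to a vanishing fraction
of the planet.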

Lee

[1] One second of subjective time per second of objective time.


