From: Michael Roy Ames (michaelroyames@yahoo.com)
Date: Sat May 22 2004 - 11:32:48 MDT
Eliezer,
You wrote:
>
> I was speaking of me *personally*, not an FAI.
> An FAI is *designed* to self-improve; I'm not.
> And ideally an FAI seed is nonsentient, so that
> there are no issues with death if restored from
> backup, or child abuse if improperly designed
> the first time through.
>
Your definition of 'sentient' must be substantially different from mine if
you are suggesting that an FAI seed might be nonsentient. Could you give us
your working definition of 'sentient'?
I am heartened to see that you are thinking of the issues of death and
(?child?) abuse for created beings. It is something that has bothered my
conscience on several occasions when thinking about the topic. However, it
is likely we can only go so far in avoiding/preventing death and abuse when
research is being done, simply from a lack of knowledge and experience with
the issue. When so much is at stake, we must not second-guess ourselves so
much that we lose the game. Reasonable precautions: yes. Completely
provable avoidance of abuse/death: no. Now, if we *can* completely avoid
abuse/death without major time delays, then that is wonderful. But we all
hear the ticking of the clock.
>
> Again, I do not object to the *existence* of a
> source control system for humans. I say only
> that it should be a last resort and the plan
> should be *not* to rely on it or use it.
>
This would seem to be a personal choice issue - the use of source control
for beings, human or not. Personally, I would want to use it - but it would
be a 'last resort' option.
Michael Roy Ames