Re: Question about CEV

From: Stathis Papaioannou (stathisp@gmail.com)
Date: Mon Oct 29 2007 - 18:16:13 MDT


On 30/10/2007, Thomas McCabe <pphysics141@gmail.com> wrote:
> The FAI, basically, extrapolates what would happen if *we*
> reprogrammed our motivational systems. If someone has a mental
> illness, and wants to get rid of it, the FAI will extrapolate their
> desires without the illness interfering. This is also the process used
> to get rid of "stupid stuff we did when we were young"-type things
> that have been mooching around in our heads; the FAI tries to simulate
> the natural self-debugging process we all seem to go through. As for
> people who are mentally ill enough to not be capable of understanding
> what we mean by "mental illness" and why it is a bad thing, quite
> frankly, I'm not sure how to handle them.

What happens today is that people who lack insight into their mental
illness and are suffering because of it have treatment forced upon
them. Mental health services estimate their CEV, and act accordingly.
If someone does not have a diagnosis of mental illness, on the other
hand, they are allowed to act self-destructively if that is their
choice.

There is a danger that an AI may decide that we are all in a position
analogous to that of the mentally ill who lack insight, with poor
judgement on account of our relatively low intelligence and primitive
drives. I
would feel much better if the AI did exactly what I wanted. If I
wanted it to do what it thought some idealised version of me wanted,
then that is what I would tell it to do. I don't want it guessing
that an idealised version of me would want it to do what a further
idealised version of me wanted, and so on.

-- 
Stathis Papaioannou

