From: Norman Noman (overturnedchair@gmail.com)
Date: Tue Oct 17 2006 - 08:05:07 MDT
On 10/17/06, Stephen Tattum <S.Tattum@dundee.ac.uk> wrote:
>
> Is climate change an SL4 issue?
Not really
> Would a FAI perhaps be the best way for the human race to avoid
> destroying our own planet?
Yes
> For example, say someone programmed an AI whose morality was grounded on
> environmental concerns - even an emotional love for Gaia. Would this
> mean that people would be despised and eventually culled to protect the
> earth - or as part of the whole system, just a part that leads to too
> much dangerous rapid change - would our population be controlled - would
> it need to be?
How humans would be treated depends entirely on the details of how the AI is
programmed. If its goal structure is vague or defined with mistakes, humanity
would likely be wiped out, along with most of the rest of the universe.
> Is it even perhaps the case that without a FAI helping us humans,
> seeing the bigger picture and being able to organise and arrange things
> accordingly, we are doomed. Is FAI possibly the only way to save the
> world and our species?
>
Yes, but climate change is not the reason. As threats to humanity go, it's
not even in the top 10.