From: Marc Geddes (marc_geddes@yahoo.co.nz)
Date: Tue Jun 01 2004 - 23:59:59 MDT
I guess I'm just a little disappointed. It still seems
kinda like a 'cop out' to me.
I thought the whole idea was to design an AI capable
of reasoning about morality (which to my mind
implicitly assumes an 'Objective Morality' independent
of opinion, which could be discovered through the use
of reason - Hume's 'You can't derive ought from is'
dogma notwithstanding).
The collective volition 'solution' seems rather like
building an AI and being given the following answer:
'Oh I don't know what morality is, I'll just throw the
question entirely back on you humans! Tell you what,
you humans can vote on it - give me your opinions and
that's what I'll take to be morality. Cheers! ' ;)
Seems like a terrible waste of super-intelligence to
me. Perhaps the whole concept of 'Collective
Volition' has gone over my head, I don't know. I'm
sticking to my guns and still favoring the 'Objective
Morality' approach.
I do of course still fully support Sing Inst (after
all, I realize I probably don't know what the hell I'm
talking about here - Eli knows a lot more than I do).
But I can't help but be disappointed with 'Collective
Volition'.
=====
"Live Free or Die, Death is not the Worst of Evils."
- Gen. John Stark
"The Universe...or nothing!"
-H.G.Wells
Please visit my web-sites.
Science-Fiction and Fantasy: http://www.prometheuscrack.com
Science, A.I, Maths : http://www.riemannai.org