From: Keith Henson (hkhenson@rogers.com)
Date: Wed Nov 03 2004 - 22:59:11 MST
At 05:13 PM 04/11/04 +1300, Marc Geddes wrote:
> --- Eliezer Yudkowsky <sentience@pobox.com> wrote:
>
> >
> > 56 people donated
snip
>Sitting around fantasizing about building monuments
>post-Singularity won't get us anywhere. We're
>currently stuck on the other side of the Singularity -
>and what a banal, brutish existence it is!
No kidding.
>Worse, AGI theory seems to be getting more
>complicated. There is more to CV than to earlier
>versions of FAI theory, and it would appear to my eyes
>that for every AI problem Eliezer is solving, two more
>are popping up. My suspicion is now overwhelmingly
>strong that (a) CV cannot be calculated by a
>Singleton, (b) that completely selfless AGI is
>impossible, and (c) that real-time general
>intelligence without sentience is impossible. If I'm
>right, even CV is not the last word, and yet another
>major theoretical change is yet to be made by Eliezer.
Coming at this business from the evolutionary psychology (EP) side, I think Collective Volition might
not even be a good idea, though it may be that I don't understand some
special meaning beyond the obvious meaning of the words.
The problem is stressed humans. That includes those who think they see
"looming privation" in the future and those who have been switched into
a degraded, non-rational mode of thinking by a physical attack. The
"collective volition" of such people is scary.
If you want an example, consider the expressed "collective volition" in the
election just past.
I am not trying to replace Eliezer's proposals with something of my own,
because I don't have any good ideas. Hell, I don't even know what to do
with my insights about the psychological mechanisms that are causal links
to wars.
Keith Henson