From: Mark Nuzzolilo (email@example.com)
Date: Tue Nov 03 2009 - 15:25:38 MST
The question falls in the realm of Intelligence Amplification (IA), which is often named as one of the avenues to the singularity.
It would seem to me that these concepts hint at the potential of intelligence amplification (depending on what your definition of "intelligence" is). Unfortunately, any legitimate theories on this would be hard to develop, given the lack of ethical research in this area.
There are plenty of motivated intelligence researchers on this list. If this gives them any ideas, great. If not, then I'm not going to lose any sleep over it. I wasn't planning on having a debate over this; I just wanted to throw the concepts into the mix, and I thought it would be a welcome change from what has been going on here as of late.
On Tue, Nov 3, 2009 at 3:16 PM, Gwern Branwen <firstname.lastname@example.org> wrote:
> It sounds to me like you are asking what are the limits of weak
> superintelligence. (Or if you aren't, it's a boring question about
> human limits over a day or two, and not really SL4 material.)
> I don't know any convincing evidence about what a human can do given
> subjective centuries, requisite tools, and incentive to practice. I
> would suspect that they could eventually become a master of the
> technical aspects, and be much better than they started out.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT