From: Michael Vassar (firstname.lastname@example.org)
Date: Tue Aug 22 2006 - 07:39:36 MDT
>150 is a very, very high threshold, just beyond the third sigma,
Well, sort of. Empirically an IQ of 150+ does NOT enable most people to
understand SL4 issues *at all* even when instructed. It's a high threshold
by some standards but not by others. Unfortunately, we don't have any well-
defined technique for identifying people at a level beyond this, but as I
explained on this list a year ago in this post
and this one
we actually have repeatable methodologies for detecting "propensity for
important intellectual accomplishment" at a level higher than that
measurable by IQ tests, specifically whatever methodology Caltech uses in
selecting its student body, a methodology which gives them an IQ 140+
student body without using IQ tests but yields a three times higher incidence
of high-level intellectual accomplishments than would be predicted from any
>only corresponds to 1/1000 of the population. The difference between
>IQ 150 and IQ 140 is huge, and the difference between IQ 140 and IQ
>130 is huge.
The articles I linked to in my earlier posts document a roughly 60% increase
in the probability of high-level intellectual accomplishment between an IQ of
140 and an IQ of 155, i.e. roughly the 99th and 99.99th percentiles. I would
hardly call that huge. It implies that the vast majority of important
intellectual accomplishments will be by people with tested IQ scores below 155. By
contrast, the difference in probability of major intellectual achievement
between an IQ of 120 and 130 is so huge that one would be hard pressed to
think of any person with outstanding mental achievements in any field
without an IQ of 120, and also so huge that pretty much everyone in the 130+
IQ range is in stark denial of what the mental abilities of most of the
population are, to a point which I believe severely impairs their efficacy
in many tasks. The difference between an IQ of 110 and 120 is that between
a person who essentially can and one who essentially cannot learn the
scientific world view, math beyond arithmetic, and literacy sufficient to
read the higher grade of newspapers or any sort of literature.
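For concreteness, the rarity figures being traded in this thread can be checked against the normal curve. A minimal sketch, assuming IQ is normally distributed with mean 100 and a standard deviation of 15 or 16 depending on the test (these parameters are my assumption for illustration, not something stated in the posts above):

```python
import math

def tail(iq, sd=15, mean=100):
    """Upper-tail probability P(score > iq) under a normal distribution.

    Uses the identity Q(z) = 0.5 * erfc(z / sqrt(2)), so only the
    standard library is needed.
    """
    z = (iq - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# How rare is each threshold under each common SD convention?
for sd in (15, 16):
    for iq in (120, 130, 140, 150, 155):
        p = tail(iq, sd)
        print(f"SD {sd}: IQ {iq}+ ~ 1 in {round(1 / p):,}")
```

With SD 15, IQ 150 works out to roughly 1 in 2,300; with SD 16 it lands much closer to the 1-in-1,000 figure quoted above, which is one reason these threshold arguments depend on which test's norms are in play.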
> This is why Eliezer writes about prospective seed AI
>programmers, "If we were to try quantifying the level of brainpower
>necessary, our guess is that it's around the 10,000:1 or 100,000:1
>level." The difference between 160 and 150 could also be huge,
I agree that there are people who show far more mental ability than is
predicted by an IQ of 150, but the set of people who show such ability seems
empirically to have little overlap with the set of people with IQ scores far
above 150.
>The idea that IQ doesn't matter "all that much" is among the most
>frequently repeated myths with the greatest amount of available
>evidence to demolish it. The whole "emotional intelligence" thing is
>pop-psychological nonsense, considered fringe thinking by the
>mainstream psych community. IQ matters for social skills. IQ matters
>for real-life achievement. IQ matters for common sense.
For a counter-argument based on actual empirical evidence, see
The author is using a 10-question vocabulary test as a proxy for IQ, but he
is correct in the belief that such a test probably has a g-loading large
enough to generalize its results to IQ, though not to transfer its results
I had previously encountered very mixed data on IQ and social skills,
including empirical data by g realists who claimed that IQ predicted
physical skills better than social skills.
>for which people will be able to wrap their brains around what I'm
>arguing in this post and for which people it will go in one ear and
>out the other.
That's terribly close to ad hominem. Also, as I mentioned, untrue. Lots of
people on this list have IQs at the highest levels and can't wrap their
brains around the most rudimentary and obvious SL4 conclusions.
>And by the way, discussions of IQ are most definitely SL4.
>Conversations about intelligence and comparisons between intraspecies
>and interspecies intelligence differentials make up the core of the
>argument that most people on this list *still* don't understand: the
>Singularity is about smartness, not technology, and the second you
>build something smarter than you, everything we know flies out the
>window.
Well yes, but the sorts of advantage that an AI, even one that was not a
superintelligence, would likely have relative to a human don't necessarily
closely resemble those a smarter human would have unless the AI is an
upload, or at most a neuromorphic AI. IQ metaphors can be misleading.
After all, much less intelligent parents routinely manage to keep control
over their more intelligent children by virtue of superior knowledge,
physical power, social support, etc., for quite a while, a situation entirely
unlike that with a UFAI.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT