From: Philip Goetz (firstname.lastname@example.org)
Date: Wed Feb 15 2006 - 08:39:12 MST
On 2/13/06, Dani Eder <email@example.com> wrote:
> There is copious evidence that the rate of
> technological and scientific progress has been
> accelerating over the last several hundred years.
Just to jump in - this is one of my interests.
It seems to me that the majority of evidence shows that
the rate of technological and scientific progress, and of
social change, has been slowing down since 1970,
possibly earlier. I don't need to get into an argument
about it; just let it be noted that I do not accept this
assertion without an argument, and that I hope others
will stop repeating this mantra, which has never
been demonstrated empirically.
> There is good reason
> to expect it to continue to accelerate in the near
> future:
> 1. The inherent intelligence of humans is not changing
> fast on a time scale of hundreds of years.
(Actually, it appears to be increasing rapidly, as
measured by a wide variety of IQ tests; see the "Flynn
effect". Also, the number of mental illnesses people
develop is increasing rapidly. I had previously thought
this was due to environmental degradation, but the high
correlation between high IQ and mental illness suggests
another factor.)
> 3. Of those humans, an increasing fraction are
> receiving advanced education, leading to more
> scientists and engineers
There are two popular ways of trying to evaluate the rate
of progress. One involves measuring inputs: the number of
PhDs granted, the number of scientific journals, etc.
This generally shows an exponential increase (although
the data points from the 19th century used to make that
exponential appear to have been made up out of thin air
by de Solla Price (/Little Science, Big Science/, 1963),
by calculating the curve for the 20th century and
extrapolating it backwards).
Another involves measuring outputs: trying to evaluate
the significance of advances, counting papers with more
than a certain number of references to them, or evaluating
changes in people's lives, lifespan, leisure time, or
wealth. This latter generally shows a rate of change that
is linear over time, decreasing since about 1970, and
exponentially decreasing at all times going back thousands
of years when phrased in terms of change per person or
effective output per
researcher. See "Rescher's law of logarithmic returns",
which states that the outputs achieved to date in any
technical field are a logarithmic function of the inputs
that have so far been devoted to it.
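The shape Rescher's law implies can be sketched numerically:
if cumulative output is k * log(I) and inputs grow
exponentially, output grows roughly linearly in time while
output per researcher-year shrinks. The constants below
(k, the 5% growth rate) are illustrative, not fitted to
any data.

```python
import math

# Illustrative sketch of Rescher's law of logarithmic returns:
# cumulative output O = k * log(I), where I is the cumulative
# input (say, researcher-years). Constants are arbitrary,
# chosen only to show the shape of the curves.
k = 10.0   # scaling constant (illustrative)
g = 1.05   # assumed 5% annual growth in yearly research input

def cumulative_input(t):
    # Sum of the geometric series g**0 + g**1 + ... + g**t.
    return (g ** (t + 1) - 1) / (g - 1)

for t in range(0, 101, 20):
    inputs = cumulative_input(t)
    output = k * math.log(inputs)   # Rescher: O = k log I
    marginal = k / inputs           # dO/dI: return per unit input
    print(f"year {t:3d}: cumulative output {output:6.2f}, "
          f"output per researcher-year {marginal:.5f}")
```

With exponentially growing inputs, cumulative output rises by
a nearly constant amount each period (roughly k * log g per
year), while the marginal return per researcher-year falls off
exponentially, which is exactly the "linear output, decreasing
effective output per researcher" pattern described above.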
The notion that the rate of change is increasing is
appealing, but has no basis in fact as far as I know.
It is a claim that can be made only in complete
ignorance of history.
Compare 1970 to 2006 - that's a 36-year timespan.
Then compare 1934 to 1970, or 1898 to 1934,
or 1862 to 1898, or 1826 to 1862, or 1790 to 1826,
or 1754 to 1790, or 1718 to 1754, or 1682 to 1718 -
I don't think that anyone with any knowledge of any
one of those time periods could say anything but
that the technological, epistemological, and cultural
change during them - in Western society - was
greater than during the period 1970-2006.
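The comparison above just steps back from 2006 in equal
36-year intervals; for completeness, they can be enumerated
mechanically:

```python
# Enumerate 36-year spans stepping back from 2006, as in the
# comparison above, stopping once the spans reach the
# early 1700s.
spans = [(end - 36, end) for end in range(2006, 1700, -36)]
for start, end in spans:
    print(f"{start} to {end}")
```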
I think something else - probably the time devoted
to life overhead such as taxes, automobile maintenance,
PC maintenance, college education, etc. - is what
is actually increasing, causing stress and the
impression that the world is changing ever faster.
- Phil Goetz
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT