From: Brian Atkins (brian@posthuman.com)
Date: Wed Mar 12 2003 - 11:02:32 MST
I think if you attempt this argument at a surface level it will be
difficult... better to dig deeper into specific trends such as computing
power, and try to determine specifically why they will or won't keep
trending. With computing power you can trace progress through at least
4 or 5 different technologies/paradigms, from mechanical relays through
vacuum tubes and discrete transistors to microchips. Each time a
particular computing technique maxed out, a new one came along.
We are currently at an interesting spot where chip manufacturing is
hitting a bit of an economic wall. The problem is a combination of two
things: it costs more and more money for companies to switch to each
new advanced lithography process, while at the same time transistor
tech in general has gotten good enough that many classes of devices
work perfectly well on older manufacturing processes (in many cases
they may actually work _worse_ if ported to the newer processes, due to
increased current leakage and other effects).
The effect at foundry companies like Taiwan Semi, for instance, is that
fewer and fewer customers are interested in paying for the most
advanced processes, due to both the cost and the lack of need. I can't
post it due to copyright issues, but this was described with some nice
graphs in the December 2002 Gilder Technology Report. They call the
idea the "value transistor": each particular computing device has a
particular litho process at which its price/performance peaks.
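
To make the "value transistor" idea concrete, here is a toy sketch in
Python. The numbers are invented purely for illustration (they are not
from the Gilder report); the point is just that performance-per-dollar
can peak at an older node once porting costs and leakage eat into the
gains from a shrink.

# Toy illustration of the "value transistor" idea: for a given class
# of device, price/performance peaks at some particular litho node
# rather than at the most advanced one. All numbers below are invented
# for illustration; they are NOT from the Gilder report.

# (node in nm, relative performance, relative cost per chip)
nodes = [
    (250, 1.0, 1.0),
    (180, 1.6, 1.3),
    (130, 2.3, 2.0),  # gains flatten while costs keep climbing...
    (90,  2.6, 3.4),  # ...and leakage can erase the remaining gains
]

def price_performance(perf, cost):
    return perf / cost

best = max(nodes, key=lambda n: price_performance(n[1], n[2]))
for node in nodes:
    nm, perf, cost = node
    marker = "  <-- price/performance peak" if node == best else ""
    print(f"{nm:4d} nm: {price_performance(perf, cost):.2f}{marker}")

Run it and the peak lands on the 180nm row, not the newest node: for
this hypothetical device, moving to 90nm would actually be a step
backwards in value.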
So is Moore's Law going kaput soon due to economics? At the moment I
don't think so. Other, more leading-edge areas of computing besides the
foundry business, from supercomputing to PC CPUs, still seem to have
enough need for the upcoming litho processes that we should continue
onwards down past 90nm without any worries.
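
For reference, here's the rough arithmetic behind that roadmap as a
quick Python sketch. The ~0.7x linear shrink per generation is the
standard industry rule of thumb; the familiar node labels fall out of
it.

# Each litho generation shrinks linear feature size by ~0.7x, so
# transistor area roughly halves and density roughly doubles per
# generation -- the density side of Moore's Law.
node = 130.0   # nm -- roughly the leading edge as of early 2003
density = 1.0  # transistor density relative to 130 nm
for _ in range(5):
    node *= 0.7     # ~0.7x linear shrink per generation
    density *= 2.0  # area scales as the square: 0.7^2 ~= 0.5
    print(f"~{node:3.0f} nm node: ~{density:2.0f}x the density of 130 nm")

The printed sequence (~91, ~64, ~45, ~31, ~22 nm) matches the 90/65/
45/32/22 node names on the industry roadmap.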
Then later in this decade, serious alternative manufacturing processes
(many approaches are under development and research right now) will
likely begin to appear, either taking us into a new phase past
"microchips" or at least making the economics and performance of new
chip processes dramatically more appealing. There is a legitimate worry
that software needs will not keep pace and that demand for
higher-performance graphics and "PCs" will die off, but I think gaming
and other advanced uses will save the day. (I'm looking forward to
seeing what that 1 teraflop PlayStation 3 will be able to do :-)
Supercomputers are iffy too, but I think there will be enough
government and other demand to keep them advancing at least through
this decade.
--
Brian Atkins
Singularity Institute for Artificial Intelligence
http://www.intelligence.org/