From: Eliezer S. Yudkowsky (email@example.com)
Date: Fri Aug 19 2005 - 10:37:44 MDT
Phil Goetz wrote:
> Re. Eliezer & Richard's comments about needing people
> who are members of all 6, 7, 8, or 9 communities -
> That isn't how things are done. When NASA wanted to
> put a man on the moon, they didn't say, "Hey, let's
> find some people who understand propulsion, materials
> science, rocketry, bioscience, astronomy, radio
> communications, and computer science." They found
> a bunch of people, each with expertise in some things,
> and some people with the ability to manage those people
> and get them to work together.
NASA was trying to solve, as a matter of engineering, a problem whose
fundamental physics were well-understood.
> AI has too many lone wolves who are determined to do
> things their way or no way. Everybody wants to be
> Luke Skywalker - the Chosen One who succeeds where
> entire empires have failed. I don't mean just those
> on this list; I mean even the most highly-respected
> and often-quoted AI researchers, whom I will not name
> here because I might want to get a job from one of them
> someday. Maybe AI will actually progress if we can
> cure Skywalker Syndrome.
Bringing order out of scientific chaos is oft-done by teams, but also oft-done
by Luke Skywalkers; such is the lesson of history. Why? One overlooked
reason, I suspect, is that once someone latches on to a piece of the problem
they have an advantage in figuring out the rest as well: a first-mover effect.
But also because quite often you *do* need to fit all the knowledge into one
person's head. Some knowledges can only be properly collated using
intercortical bandwidth, not interpersonal bandwidth. In AI it's not so much
a matter of collation as knowing, for each of the necessary fields, how not to
make the mistakes which that field knows about.
--
Eliezer S. Yudkowsky                    http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT