From: Thomas McCabe (pphysics141@gmail.com)
Date: Tue Jan 29 2008 - 13:34:39 MST
Desirability / getting there
* There's no reason for anybody to want to build a superhuman AI.
o Rebuttal synopsis: A superintelligent AI, by definition,
would be able to do *anything* faster than any human can. No matter
what you want to do, a superintelligent AI can help you do it better.
* A Singularity through uploading/BCI would be more feasible/desirable.
* Life would have no meaning in a universe with AI/advanced
nanotech (see Bill McKibben).
o Rebuttal synopsis: If we wanted to, we could always choose
not to use advanced technologies, or just keep them running in the
background to protect us from asteroids and whatnot.
* A real AI would turn out just like (insert scenario from sci-fi
book or movie).
o Rebuttal synopsis: Science fiction is entertainment, not
an actual prediction of how things will turn out. You can't generalize
from fictional evidence.
* Technology has given us nuclear bombs/industrial slums/etc.; the
future should involve less technology, not more.
o Rebuttal synopsis: Because of technology, the average
quality of life is much, much better than it was (say) a thousand
years ago. If we wanted to, we could throw out all of our computers
and cellphones tomorrow. We choose not to, because we know that
technology improves our lives.
* We might live in a computer simulation and it might be too
computationally expensive for our simulators to simulate our world
post-Singularity.
o Rebuttal synopsis: This scenario can be used to argue for,
or against, any idea whatsoever. For idea X, just say "What if the
simulators killed us if we did X?", or "What if the simulators killed
us if we didn't do X?".
    * AI is too long-term a project; we should focus on short-term
goals like curing cancer.
          o Rebuttal synopsis: AI could actually wind up being easier
than curing cancer, at least in terms of the money and man-hours
involved. And the impact of AI is huge: it could cure every disease
known to humankind, as well as solve a whole bunch of other problems.
* Unraveling the mystery of intelligence would demean the value of
human uniqueness.
o Rebuttal synopsis: Many, many scientific advances have
made humans seem less special (Copernicus, Darwin, etc.). With
hindsight, we still see these advances as good things.
    * If this were as good as it sounds, someone else would already be
working on it.
o Rebuttal synopsis: Every great idea was passed over
thousands of times before someone got around to working on it. Every
new startup company depends on the principle that an idea can be good,
and yet not taken by someone else. And startups are now one of the
main drivers of our economy: all five of the Internet's most-visited
websites were originally startup companies.
    * Singularity utopias are all written from within an elite Western
intellectual culture: a Singularity, with machines taking over, would
threaten the diversity of other forms of thought, such as religion and
less technology-based cultures.
          o Rebuttal synopsis: With the Internet, we have seen an
explosion of different subcultures, including religious and even
anti-technology ones - better communications allow ideas to spread
faster, and a Singularity is likely to do that even more effectively.
Furthermore, a Friendliness model such as CEV would preserve existing
cultures to the extent that humans ultimately wish to see them
preserved.
- Tom