Re: nagging questions

From: Eliezer S. Yudkowsky
Date: Wed Sep 06 2000 - 09:34:30 MDT

Samantha Atkins wrote:
> Actually I was talking about an engineering project also, memetic
> engineering. You would not have to convert all of humanity. Just a
> high enough percentage of the most powerful and influential.

There are some people whom you will just never, ever be able to convert, because
they are too damn stubborn and they just don't want to listen to you. Like,
say, the rulers of China. And when it comes to military nanotechnology, that
handful is enough to mess up the planet.

> And why should I believe that it is even possible for a hand full of
> Singularitarians to birth this godling? I would generally think such an
> achievement could only be a capstone on the work of the best minds of
> the entire race.

I used to think so too, back when I wrote CaTAI 1.0. By the time I got to PtS,
it was an open-sourced industry. By the time I got to CaTAI 2.0, I was
thinking in terms of a private research project, say fifty to a hundred people.

> But a part of me thinks that if the race cannot get itself together
> enough to avoid destruction then it is no fit parent of a Power. We,
> who cannot deal with and take care of one another, dream of creating a
> magic Genie who will, despite being originally designed by us and in our
> image, care for us and fix all of our problems and brokenness? Doesn't
> that sound just a bit like a form of escapism and the height of wishful
> thinking?

This sounds to me like romantic defeatism. Humanity is up against a problem
that it has no a priori reason to be capable of solving, a problem where a few
bad guys can mess up a planetful of good guys. As for the Sysop being
designed by us, so what? Clocks are designed by us, but they don't war amongst
themselves.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT