Re: Donate Today and Tomorrow

From: Samantha Atkins (samantha@objectent.com)
Date: Sun Oct 24 2004 - 02:31:03 MDT


On Oct 21, 2004, at 4:53 PM, Slawomir Paliwoda wrote:
>

> And no, I'm not trying to imply anything about cults here, but I'm
> trying to point out the common factor between the two organizations
> which is that, assuming it's next to impossible to truly understand
> CFAI and LOGI, commitment to these projects requires faith in
> implementation and belief that the means will lead to the intended
> end.
> One cannot aspire to rationalism and rely on faith at the same time.
>

But it is not impossible to understand CFAI or LOGI. They require some
stretching in places, and they raise many questions whose answers,
while perhaps available, are not immediately obvious - but they are
certainly not impossible to understand.

> I've noticed a Matrix quote in your essay ("Don't think you are, know
> you are"). There is an equally interesting quote from Reloaded you
> might agree with, and it is when Cornel West responds to one of Zion's
> commanders, "Comprehension is not requisite for cooperation." And even
> though I'm convinced that the Matrix trilogy is an overlooked
> masterpiece,
> much farther ahead of its time than Blade Runner ever hoped to be, I
> don't think Mr. West was correct. Comprehension is indeed a requisite
> for cooperation, and as long as you are unable to find a way to
> overcome the "comprehension" requirement, I don't think you should
> expect to find donors who don't understand exactly what you are doing
> and how.
>

Eliezer will no doubt answer this. But why exactly is full
comprehension necessary? Isn't all that is really necessary a
sufficient understanding to see the rightness of the Cause, plus a
reasoned belief that SIAI can actually achieve it, or at least take a
good stab at achieving it? Now, I don't know that SIAI will succeed. I
believe Eliezer is a brilliant theorist, but that he cannot succeed
without either becoming or recruiting some very strong software
architects and implementation experts. I am not at all sure he is able
to lead such a team, or that he is interested in doing so.

I also believe that much more effective evangelism, reaching people at
various levels of understanding, will be required to raise sufficient
funds and sufficient passion in the very gifted people who are needed.
Eliezer has expressed a lack of interest in this area. Possibly he is
correct that he can't do this work, or that it would be a terrible
distraction. But without the multiplying effect of an effective
evangelistic spread of support, there will not be much hope of turning
even the best of theories into actual reality.

Donating to the SIAI at least gives it a chance to get strong enough to
recursively self-improve into an organization fully equipped for the
work.

> Let's say I'm a potential donor. How do I know, despite sincere
> intentions of the organization to save the world, that the world won't
> "drift toward tragedy" as a result of FAI research made possible in
> part by my donation? How do I know what you know to be certain without
> spending the next 5 years studying?
>

You don't. But you probably do know that the world, as it is currently
drifting, is all too likely bound for great disaster - disaster large
enough to potentially destroy humanity, or at least to set it back
generations and kill billions. I have thought that there must be other
ways to avoid such an outcome, ways that do not involve something so
quickly beyond our control. But the world has already evolved beyond
our control, and it is inadequately intelligent and inadequately
intelligible to minds as limited as our own.

>
> Other questions:
> Why would the SIAI team need so much money to continue building FAI
> if the
> difficulty of creating it does not lie in hardware? What are the real
> costs?
>

Besides the obvious fact that this current request was to retain a
favorable tax status: the work requires a great many very fine brains
working quite hard and effectively for a long time, and that takes
working capital to create and maintain. And the fact that hardware is
not the difficult part does not mean that a considerable sum for
hardware, mostly of kinds that already exist, is not required.

> Increased donations give you greater power to influence the world. Do
> you see anything wrong in entrusting a small group of people with the
> fate of the entire human race? What would you tell people objecting
> to
> that idea? Do we have the right to end the world as we know it without
> their approval?
>

Do you see anything wrong with people going about their merry way
without bothering to notice or respond to imminent disaster? Do you see
harm in those who do see it coming deciding to do nothing at all? Do
you require that a very large group be present, and democratically
accountable to everyone, including all those you said were incapable of
understanding, before anything can be done to address the situation?

- samantha


