From: James Higgins (firstname.lastname@example.org)
Date: Fri Jun 28 2002 - 13:27:03 MDT
At 08:53 AM 6/28/2002 -0400, Brian Atkins wrote:
>Samantha Atkins wrote:
> > Brian,
> > Please give it a rest. You are making yourself look bad.
Regardless of my personal opinion on the matter, I'd like to hear
Samantha's reasons behind this statement.
>I'm going to be busy today, but hopefully later tonight I can respond to
>Ben's messages. This though I have time for.
>I don't think trying to evoke a little more critical discussion of Ben's
>plans compared to the literal mountains that have been directed at SIAI
>is making me look bad. I don't think trying to point out to a few people
>here that they need to be a little more careful how they choose to put
>their "AI faith" is making me look bad. I already stressed this is not
>some kind of political rhetoric to debunk the other potential tribal
>chief. If you are seeing something that isn't really there, that's not
Ok, I'll take that "AI faith" reference to at least partially refer to myself.
On this topic, I'll start by saying my views regarding Ben and Eliezer were
not formed capriciously, at random or on a small sampling of information
(though, admittedly, smaller than I would like - but I don't have direct
access to either of them). I've been reading this list since March, 2001,
and I've corresponded with both Eliezer & Ben via private email. Based on
my personal interactions with both of them and reading their interactions
with others on this list:
1) I don't think either of them is 100% correct (or anywhere close);
2) I don't think the personal views or morality of either is applicable to
the population at large
3) Both of them are highly intelligent
4) Both of them are strong supporters of The Singularity
However, over the course of this time and the interactions I have witnessed,
Ben Goertzel has displayed more wisdom and vastly more maturity in MY
opinion. Eliezer S. Yudkowsky is very intelligent, but also very young.
The people directly responsible for The Singularity, also in MY opinion,
should at least:
1) Realize that their views are not applicable to everyone
2) Value the views and wishes of people who have conflicting beliefs
3) Have a good degree of empathy with other humans
4) Value human life and free will very highly
5) Want to see The Singularity occur (but not so much as to cloud their
judgment)
6) Fully realize, believe and know that they could be DEAD WRONG about
anything or everything
7) Be mature and responsible enough to ensure that The Singularity is done
in the best manner possible
I'm quite certain that I've missed some points here. This is not intended
to be an exhaustive list, just a list to illustrate the reasoning behind my
opinion.
Based on my interactions with Ben & Eliezer, viewing their interactions
with others and reading some of their published works:
Eliezer:
Fully satisfies the statement in #4.
Gets partial credit for #1, #2, #5 and slightly less so for #7.
Needs much more work on #3 and #6.
Ben:
Fully satisfies the statements in #1, #2, #4, #6 and #7.
I don't have enough details (yet) to know if he only gets partial
or full credit for statements #3 & #5.
This is about as plain and laid out as I can put it. Hopefully this helps
clarify my views and the reasons for which I hold them.
Please note that this is ONLY MY OPINION. I've been completely wrong
about things in the past and I don't see any reason why this could not be
the case here. Obviously, at present I don't think I'm wrong - or I'd have
changed my opinion. Gaining empathy, in particular, takes a reasonable
amount of experience (not the best word) with humanity as a whole. Time and
human interaction breed empathy.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT