From: Michael Vassar (firstname.lastname@example.org)
Date: Tue Dec 13 2005 - 20:11:59 MST
>"Whatever" also includes private AI groups. Maybe the probability that
>"they" will listen to "us" is small, but I think the probability that "we"
>will create the first AGI is also small.
The first probability isn't small; it's negligible. The second probability
is probably small, as in <50%, but given a reasonable definition of "we" it
appears very substantial. There is a VERY strong correlation between the
ability to build an AI and the ability to basically understand the
singularity. Across a wide range of environments, the latter ability will
result in ending up here.
At any rate, I think that the chance of "us" making a difference directly is
FAR greater than that of "us" making a difference through our influence.
>The cases are different. Opting to go for an Oracle first need not mean
>giving up a powerful technology that adversaries would then acquire
>instead, nor does it require global coordination.
It requires both: global coordination so that no one else builds a
harder-takeoff seed AI, and acceptance of the risk that someone else builds
one first, since by going for an Oracle you are slowing down.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:54 MDT