From: Joel Peter William Pitt (joel.pitt@gmail.com)
Date: Sun Jul 10 2005 - 17:31:15 MDT
On 7/10/05, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
> The point of no return is the enemy that is substantially smarter than you
> are. Opposing an entity dumber than me, I have a real chance of winning no
> matter what apparent advantages they start out with. Against a better mind
> than mine, I would not expect to win regardless of material advantages.
I'd have to disagree.
Pure intelligence has its uses and greatly influences the odds of a battle
when strategizing. But let's say we throw you up against 1000 ravenous,
rabid rats; taken on its own, each is, I'm sure everyone would agree,
less intelligent than Eliezer. The weight of numbers in warfare can't
just be ignored. Sure, the intelligence difference between us and other
mammals is probably very small compared to the difference between
an AGI and us, but there would be around 5-6 billion of us versus a lone
AGI, so I think it balances out (at least during the early stages of
takeoff).
There are, of course, differences between the situation we are
considering and the above, since presumably a threatening AGI would
have some influence over the physical realm and might be able to create
minions to assist it.
Back to Phillip's original question about when it is too late to do
anything... well, I think that when di/dt (where i is the difference in
intelligence) outpaces our ability to hold parliamentary sessions to
decide what to do, then we are all doomed. ;P
Joel
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT