From: Dan Clemmensen (dgc@cox.rr.com)
Date: Sat Jan 26 2002 - 19:16:40 MST
Ben Goertzel wrote:
> I was explaining some Singularity stuff to my kids this morning, and my
> 12 year old said "Yes, but what if after a machine gets ten times as
> smart as people, IT figures out a reason why it's incredibly hard to make
> machines any smarter than that... a reason that we can't understand because
> we're too dumb."
Yes, my 11-year-old and her friends also have interesting insights.
When she was ten, she argued against becoming an SI because she feared
boredom in that state.
>
> Coincidentally, last night I was doing some simple calculations regarding
> the
> time complexity of learning computer programs and I obtained some results
> that sort of point in this direction. Of course, it's all very heuristic
> and inconclusive, but it's food for thought...
>
[snipped description of growth behavior of two algorithms as examples.]
These are interesting examples, but as you point out, we do not yet know
the nature of the algorithms or non-digital techniques that an SI
may ultimately employ. I think we'll get modestly superior intelligence
using "standard" computers, and that this will get us to AGI and another
level of SI, but that an SI can then pursue alternative strategies such
as holographic, quantum, and analog techniques, which have different
constraints.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT