Re: Singularity awareness (no news here)

From: Richard Loosemore (rpwl@lightlink.com)
Date: Sat Jun 03 2006 - 10:36:03 MDT


CyTG wrote:
> I'm usually a silent reader of this list, and while I'm a
> singularity sceptic I rarely speak up on it, because I figure my own
> home-grown think tank is just that, one man's uneducated point of view
> (I'm a programmer, fiddling with a little AI from time to time, which hardly
> qualifies me for anything!).
> Now when someone states something like
>
> "In my estimate, the implementation in neural hardware is not necessary,
> and I believe that we have the hardware to do it now. In fact, I think
> we have had that hardware for about a decade. That is a rough estimate
> based on my understanding of human cognitive architecture, and my take
> on what the design of the first successful AGI will be like."
>
> I'm curious: what qualifies you to make such a statement?
> Don't get me wrong, if it's just the product of your own
> "uneducated" home-grown think tank, that's quite fine, but one could also
> get the impression that you're working in, or educated in, the field of
> CS/AI and/or some aspect of the human psyche.
> Just curious :o)

Easy: approximately 25 years' experience as a cognitive science
researcher and software engineer, specializing in the AI/neural
network/machine learning field.

I would be the first to emphasize that what I said above is *my*
assessment, and I know that many disagree with it. But we each bring a
different perspective to the field.

Richard Loosemore

>
>
> On 6/2/06, *Richard Loosemore* <rpwl@lightlink.com
> <mailto:rpwl@lightlink.com>> wrote:
>
>
> If I may: this seems to be an example of what has come to be a standard
> calculation of <When Gate-Count Will Equal Neuron-Count>. People come
> up with different numbers, of course, but lots of people do the same
> calculation.
>
> Now, you point out that this is only some kind of upper bound, and that
> it may not be as important as (e.g.) architecture ...... but to my mind
> this kind of calculation is a *complete* distraction, telling us almost
> nothing but making us think that it means something.
>
> In my estimate, the implementation in neural hardware is not necessary,
> and I believe that we have the hardware to do it now. In fact, I think
> we have had that hardware for about a decade. That is a rough estimate
> based on my understanding of human cognitive architecture, and my take
> on what the design of the first successful AGI will be like.
>
>
> Richard Loosemore
>


