RE: Hopfield networks as a model of memory?

From: ben goertzel (ben@goertzel.org)
Date: Thu Apr 18 2002 - 10:45:08 MDT


***
The standard symmetric-weight Hopfield net stores at most .14n floats
worth of memory using n*n neural net links. Furthermore, the networks
fill up with memories very quickly, and then can't learn anything more.
If you put caps on the link weights, you get a network that continually
forgets old memories when its memory gets full, so it can learn new
things. But then the memory capacity goes down to .05n.
***

I erred: I meant .14n*n and .05n*n. (The net stores roughly .14n random
patterns of n bits each, so total capacity scales with the n*n links,
not with n.)

They are inefficient, but not THAT inefficient ;->
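
For anyone who wants to see the capacity limit empirically, here is a
minimal sketch of my own (not code from the original discussion). It
assumes a standard binary Hopfield net with Hebbian outer-product
storage and synchronous sign updates; the network size n, the tested
loads, and the 10% corruption level are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    """Hebbian outer-product rule; symmetric weights, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    """Iterate the deterministic update until a fixed point or step cap."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

n = 200
for load in (0.05, 0.14, 0.25):    # patterns stored, as a fraction of n
    p = int(load * n)
    pats = rng.choice([-1, 1], size=(p, n))
    W = train(pats)
    ok = 0
    for x in pats:
        noisy = x.copy()
        flip = rng.choice(n, size=n // 10, replace=False)
        noisy[flip] *= -1          # corrupt 10% of the bits
        ok += np.array_equal(recall(W, noisy), x)
    print(f"load {load:.2f}n: recovered {ok}/{p} patterns")

Run at a load of .05n, recall of corrupted patterns is essentially
perfect; near the .14n limit it degrades sharply, and well above it the
stored memories are mostly unrecoverable, which is the fill-up behavior
described in the quoted paragraph.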

ben g


