Sparsely coded associative memories: capacity and dynamical properties

Claude Meunier, Hiro-Fumi Yanai, and Shun-ichi Amari


Abstract

We consider very sparsely coded associative memories of binary neurons, for both Hebbian and covariance learning rules. We calculate explicitly their maximal capacity, both in terms of the number of stored patterns and in terms of information content, taking into account the correlation of local fields, and we investigate its dependence on the degree of sparsity. The sparseness of the coding enhances both the memory capacity and the information capacity, whichever learning scheme is chosen. The study of the retrieval dynamics shows that, as soon as the number of stored patterns exceeds some critical value, retrieval becomes limited to states with the same activity as the prototype patterns.
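As a rough illustration of the kind of model the abstract describes, the sketch below builds a sparsely coded binary associative memory with a covariance-type Hebbian storage rule and synchronous threshold retrieval dynamics. All parameter values (N, P, the coding level a, the threshold theta, the number of update steps) and the specific threshold choice are assumptions for illustration, not the paper's analysis or parameters.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact model):
# sparse 0/1 patterns stored with a covariance-type rule, retrieved by
# synchronous threshold dynamics.

import numpy as np

rng = np.random.default_rng(0)

N = 1000       # number of binary (0/1) neurons
P = 50         # number of stored prototype patterns
a = 0.05       # coding level: mean fraction of active neurons per pattern
theta = 0.0    # retrieval threshold (assumed; the paper tunes this quantity)
n_steps = 20   # number of synchronous update steps

# Sparse binary prototype patterns: each unit is active with probability a.
patterns = (rng.random((P, N)) < a).astype(float)

# Covariance-type storage: J_ij ~ sum_mu (xi_i^mu - a)(xi_j^mu - a), no self-coupling.
J = (patterns - a).T @ (patterns - a) / N
np.fill_diagonal(J, 0.0)

def retrieve(state, steps=n_steps):
    """Synchronous threshold dynamics: s_i <- Theta(sum_j J_ij s_j - theta)."""
    s = state.copy()
    for _ in range(steps):
        s = (J @ s > theta).astype(float)
    return s

# Cue: a noisy version of pattern 0 with a small fraction of flipped bits.
cue = patterns[0].copy()
flip = rng.random(N) < 0.01
cue[flip] = 1.0 - cue[flip]

out = retrieve(cue)
overlap = (out * patterns[0]).sum() / max(patterns[0].sum(), 1.0)
print(f"overlap with stored pattern: {overlap:.2f}, activity: {out.mean():.3f}")
```

Running the sketch prints the overlap of the retrieved state with the cued prototype and its mean activity, which is the quantity the abstract's remark about retrieval being confined to states with the prototype activity level refers to.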
