Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020), pre-proceedings

Yibo Jiang, Cengiz Pehlevan


<p>Recent work suggests that overparameterized autoencoders can be trained to implement associative memory via iterative maps: a stored example is recovered by repeatedly applying the trained network to a corrupted input. This phenomenon occurs when the input-output Jacobian of the converged network has all eigenvalue norms strictly below one. In this work, we theoretically analyze this behavior for sigmoid networks by leveraging recent developments in deep learning theory, especially the Neural Tangent Kernel (NTK) theory. We find that overparameterized sigmoid autoencoders can have attractors in the NTK limit, both when trained on a single example and, under certain conditions, on multiple examples. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.</p>
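The attractor condition described in the abstract can be illustrated with a minimal sketch. This is not the paper's construction: rather than training an overparameterized autoencoder, it hand-builds a small sigmoid map with a chosen fixed point (the pattern `x_star` and the weight scale are arbitrary assumptions), checks that the Jacobian at the fixed point has all eigenvalue norms below one, and shows that iterating the map from a corrupted input recovers the stored pattern.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x_star = np.array([0.3, 0.7, 0.5])       # "stored" pattern, entries in (0, 1)
W = 0.2 * rng.standard_normal((3, 3))    # small weights keep the map contractive

# Choose the bias so that x_star is an exact fixed point of f(x) = sigmoid(W x + b):
# sigmoid(W x_star + b) = x_star  =>  b = logit(x_star) - W x_star
b = np.log(x_star / (1.0 - x_star)) - W @ x_star

def f(x):
    return sigmoid(W @ x + b)

# Input-output Jacobian at the fixed point: diag(sigma'(z)) @ W,
# where sigma'(z) = sigma(z) (1 - sigma(z)) = x_star (1 - x_star).
J = np.diag(x_star * (1.0 - x_star)) @ W
spectral_radius = np.max(np.abs(np.linalg.eigvals(J)))
print("largest Jacobian eigenvalue norm:", spectral_radius)  # below one here

# Iterate the map from a corrupted version of the pattern; since all
# eigenvalue norms are below one, the iterates converge back to x_star.
x = x_star + 0.2 * rng.standard_normal(3)
for _ in range(200):
    x = f(x)
print("recovered stored pattern:", np.allclose(x, x_star, atol=1e-6))
```

With the small weight scale used here the map is a contraction, so the corrupted input flows back to the stored pattern; in the paper this role is played by the converged overparameterized network rather than a hand-tuned map.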