Curvature-corrected learning dynamics in deep neural networks

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings




Authors

Dongsung Huh

Abstract

Deep neural networks exhibit highly non-convex loss landscapes, which result in complex learning dynamics under steepest gradient descent. Second-order optimization methods, such as natural gradient descent, can facilitate learning by compensating for ill-conditioned curvature. However, the exact nature of such curvature-corrected learning processes remains largely unknown. Here, we derive exact solutions to curvature-corrected learning rules for the restricted case of deep linear neural networks. Our analysis reveals that natural gradient descent follows the same path as gradient descent, adjusting only the temporal dynamics along that path. This preserves the implicit bias of gradient-based learning, such as weight balance across layers. However, block-diagonal approximations of the natural gradient, which are widely used in second-order methods (e.g., K-FAC), significantly distort the dynamics to follow highly divergent paths, destroying weight balance across layers. We introduce a partially curvature-corrected learning rule, which provides most of the benefit of full curvature correction in terms of convergence speed, with superior numerical stability, while preserving the core property of gradient descent even under block-diagonal approximations.
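
For intuition only, the following is a minimal NumPy sketch, not taken from the paper and not the paper's learning rule. It trains a two-layer linear network with plain gradient descent and with a crude per-layer preconditioner (loosely in the spirit of block-diagonal methods such as K-FAC, not an implementation of them), and reports the weight-balance gap ||W2^T W2 - W1 W1^T|| that gradient flow conserves exactly in deep linear networks. All names, dimensions, and the damping constant are illustrative assumptions.

    import numpy as np

    def train(precondition, steps=2000, lr=1e-2, damping=0.1):
        rng = np.random.default_rng(1)
        d_in, d_hid, d_out, n = 5, 4, 3, 100
        X = rng.standard_normal((d_in, n))
        Y = rng.standard_normal((d_out, d_in)) @ X      # targets from a random linear teacher
        W1 = 0.3 * rng.standard_normal((d_hid, d_in))
        W2 = 0.3 * rng.standard_normal((d_out, d_hid))
        for _ in range(steps):
            E = Y - W2 @ W1 @ X                         # residual
            g1 = -(W2.T @ E @ X.T) / n                  # dL/dW1 for L = squared error / (2n)
            g2 = -(E @ (W1 @ X).T) / n                  # dL/dW2
            if precondition:
                # crude stand-in for a block-diagonal (per-layer) curvature correction:
                # rescale each layer's gradient by its own damped input covariance
                g1 = g1 @ np.linalg.inv(X @ X.T / n + damping * np.eye(d_in))
                H = W1 @ X
                g2 = g2 @ np.linalg.inv(H @ H.T / n + damping * np.eye(d_hid))
            W1 -= lr * g1
            W2 -= lr * g2
        balance_gap = np.linalg.norm(W2.T @ W2 - W1 @ W1.T)  # conserved by plain gradient flow
        loss = 0.5 * np.mean((Y - W2 @ W1 @ X) ** 2)
        return loss, balance_gap

    for name, flag in [("plain gradient descent", False), ("per-layer preconditioned", True)]:
        loss, gap = train(flag)
        print(f"{name}: final loss {loss:.4f}, balance gap {gap:.4f}")

Under this toy setup, plain gradient descent keeps the balance gap close to its initial value, while the per-layer rescaling lets it drift, illustrating the qualitative distinction the abstract draws between full and block-diagonal curvature correction.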