Modes of Homogeneous Gradient Flows

Ido Cohen et al., SIAM

Finding latent structures in data is drawing increasing attention in diverse fields such as image and signal processing, fluid dynamics, and machine learning. In this work we examine the problem of finding the main modes of gradient flows. Gradient descent is a fundamental process in optimization; its stochastic version is prominent in the training of neural networks. Our aim here is to establish a consistent theory for gradient flows ψ_t = P(ψ), where P is a nonlinear homogeneous operator. The proposed framework stems from analytic solutions of homogeneous flows, previously formalized by Cohen and Gilboa, in which the initial condition ψ_0 admits the nonlinear eigenvalue problem P(ψ_0) = λψ_0. We first present an analytic solution of Dynamic Mode Decomposition (DMD) in such cases. We show an inherent flaw of DMD: it is unable to recover the essential dynamics of the flow. It is evident that DMD is best suited for homogeneous flows of degree one. We propose an adaptive time-sampling scheme and show that its dynamics are analogous to those of a homogeneous flow of degree one with a fixed step size. Moreover, we adapt DMD to yield a real spectrum by using symmetric matrices. Our analytic solution of the proposed scheme recovers the dynamics perfectly and yields zero error. We then proceed to show that, in the general case, the orthogonal modes {ϕ_i} are approximately nonlinear eigenfunctions, P(ϕ_i) ≈ λ_i ϕ_i. We formulate the Orthogonal Nonlinear Spectral decomposition (OrthoNS), which recovers the essential latent structures of the gradient descent process. Definitions of spectrum and filtering are given, and a Parseval-type identity is shown.
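As a brief illustration of the degree-one case in which DMD is well suited, the sketch below applies the standard SVD-based exact DMD algorithm to snapshots of a linear flow ψ_t = Aψ sampled at a fixed step. The operator A, the decay rates, and all dimensions are illustrative assumptions, not values taken from the paper; with a linear, symmetric A the DMD eigenvalues match those of exp(A·dt) and the spectrum is real, consistent with the degree-one claim above.

```python
import numpy as np

def dmd(X, Y, r):
    """Exact DMD: given paired snapshot matrices X = [psi_0 ... psi_{m-1}]
    and Y = [psi_1 ... psi_m], return the leading r DMD eigenvalues/modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)  # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W             # exact DMD modes
    return eigvals, modes

# Illustrative degree-one (linear) flow psi_t = A psi with symmetric
# A = -Q diag(lams) Q^T; snapshots at step dt follow psi_{k+1} = exp(A dt) psi_k.
rng = np.random.default_rng(0)
n, m, dt = 5, 30, 0.1
lams = np.arange(1.0, n + 1)                       # assumed decay rates
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal eigenvectors
step = Q @ np.diag(np.exp(-lams * dt)) @ Q.T       # exp(A dt), symmetric

snaps = np.empty((n, m))
snaps[:, 0] = rng.standard_normal(n)
for k in range(1, m):
    snaps[:, k] = step @ snaps[:, k - 1]

eigvals, modes = dmd(snaps[:, :-1], snaps[:, 1:], r=n)
# For a linear flow the discrete DMD eigenvalues recover exp(-lams * dt).
print(np.sort(np.asarray(eigvals).real))
```

Because the snapshots here are generated by a fixed linear map, the reduced operator is exactly similar to exp(A·dt) restricted to the data span, so recovery is exact up to floating-point error; for homogeneous flows of other degrees no single fixed-step linear map reproduces the dynamics, which is the flaw the abstract refers to.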