Open Access System for Information Sharing

Article
Cited 2 times in Web of Science · Cited 2 times in Scopus

Smooth momentum: improving lipschitzness in gradient descent [SCIE, SCOPUS]

Title
Smooth momentum: improving lipschitzness in gradient descent
Authors
KIM, BUM JUN; CHOI, HYEYEON; JANG, HYEONAH; KIM, SANG WOO
Date Issued
2022-10
Publisher
Kluwer Academic Publishers
Abstract
Deep neural network optimization is challenging. Large gradients in the chaotic loss landscape of such networks lead to unstable behavior during gradient descent. In this paper, we investigate a stable gradient descent algorithm. We revisit the mathematical derivation of the Momentum optimizer and discuss a potential problem on steep walls. Inspired by the physical motion of a mass, we propose Smooth Momentum, a new optimizer that improves behavior on steep walls. We mathematically analyze the characteristics of the proposed optimizer and prove that Smooth Momentum exhibits improved Lipschitz properties and convergence, which allow stable and faster gradient descent. We also demonstrate how Smooth Gradient, a component of the proposed optimizer, can be plugged into other optimizers, such as Adam. The proposed method offers a regularization effect comparable to batch normalization or weight decay. Experiments demonstrate that the proposed optimizer significantly improves the optimization of transformers, convolutional neural networks, and non-convex functions across various tasks and datasets.
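For readers who want a concrete picture, the minimal Python sketch below shows the classical (heavy-ball) Momentum update that the abstract says the paper revisits, plus a gradient-smoothing hook applied before the velocity update. The paper's actual Smooth Gradient operator is not given in this record, so gradient-norm rescaling is used here purely as a hypothetical stand-in for taming large gradients on steep walls.

import numpy as np

def smooth_grad(grad, max_norm=1.0):
    # Hypothetical stand-in for the paper's Smooth Gradient operator:
    # rescale the gradient when its norm exceeds max_norm, so steep
    # walls (regions with large gradients) cannot trigger unstable updates.
    norm = np.linalg.norm(grad)
    return grad if norm <= max_norm else grad * (max_norm / norm)

def momentum_step(theta, grad, velocity, lr=0.01, mu=0.9):
    # Classical heavy-ball Momentum update:
    #   v     <- mu * v - lr * g
    #   theta <- theta + v
    # Smoothing the raw gradient first gives the smoothed variant.
    velocity = mu * velocity - lr * smooth_grad(grad)
    theta = theta + velocity
    return theta, velocity

As the abstract notes, the same gradient-smoothing component could in principle be plugged into other optimizers such as Adam, by transforming the gradient before the adaptive moment estimates are updated.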
URI
https://oasis.postech.ac.kr/handle/2014.oak/115387
DOI
10.1007/s10489-022-04207-7
ISSN
0924-669X
Article Type
Article
Citation
Applied Intelligence, 2022-10
Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


KIM, SANG WOO (김상우)
Department of Electrical Engineering

