Electronics, Vol. 13, Pages 1778: An Improvement of Adam Based on a Cyclic Exponential Decay Learning Rate and Gradient Norm Constraints

Electronics doi: 10.3390/electronics13091778

Authors: Yichuan Shao, Jiapeng Yang, Wen Zhou, Haijing Sun, Lei Xing, Qian Zhao, Le Zhang

To address limitations of the Adam algorithm such as hyperparameter sensitivity and unstable convergence, this paper proposes an improved optimization algorithm, Cycle-Norm-Adam (CN-Adam). The algorithm integrates a cyclic exponential decay learning rate (CEDLR) with gradient norm constraints, accelerating Adam's convergence and improving its generalization performance by dynamically adjusting the learning rate. To verify the effectiveness of CN-Adam, we conducted extensive experimental studies, in which the algorithm achieved significant performance improvements on both standard datasets: 98.54% accuracy on MNIST and 72.10% on CIFAR10. Given the complexity and specificity of medical images, the algorithm was also tested on a medical dataset, where it achieved an accuracy of 78.80%, outperforming the other algorithms. These results show that the CN-Adam optimization algorithm provides an effective optimization strategy for improving model performance and supporting medical research.
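
The abstract names the two ingredients, a cyclic exponential decay learning rate (CEDLR) and a gradient norm constraint, but does not spell out the update rule. As a rough illustration only, the NumPy sketch below bolts an assumed per-cycle exponential decay schedule and an assumed global gradient-norm rescaling onto a standard Adam update; the names `CNAdamSketch` and `cedlr`, all default hyperparameters, and both assumed forms are hypothetical and should not be read as the authors' exact method.

```python
import numpy as np


def cedlr(step, base_lr=1e-3, cycle_len=1000, decay=0.999):
    # Assumed CEDLR schedule: the learning rate decays exponentially
    # within each cycle and resets at the start of the next cycle.
    return base_lr * decay ** (step % cycle_len)


class CNAdamSketch:
    """Adam with a cyclic LR schedule and a gradient norm constraint,
    in the spirit of CN-Adam (illustrative sketch, not the paper's code)."""

    def __init__(self, params, base_lr=1e-3, betas=(0.9, 0.999), eps=1e-8,
                 max_grad_norm=1.0, cycle_len=1000, decay=0.999):
        self.params = params  # list of float np.ndarrays, updated in place
        self.m = [np.zeros_like(p) for p in params]  # first-moment estimates
        self.v = [np.zeros_like(p) for p in params]  # second-moment estimates
        self.t = 0
        self.base_lr, self.betas, self.eps = base_lr, betas, eps
        self.max_grad_norm = max_grad_norm
        self.cycle_len, self.decay = cycle_len, decay

    def step(self, grads):
        self.t += 1
        # Assumed gradient norm constraint: rescale so the global
        # gradient norm does not exceed max_grad_norm.
        total_norm = np.sqrt(sum(np.sum(g * g) for g in grads))
        scale = min(1.0, self.max_grad_norm / (total_norm + 1e-12))
        lr = cedlr(self.t, self.base_lr, self.cycle_len, self.decay)
        b1, b2 = self.betas
        for p, g, m, v in zip(self.params, grads, self.m, self.v):
            g = g * scale
            m[:] = b1 * m + (1 - b1) * g        # EMA of gradients
            v[:] = b2 * v + (1 - b2) * g * g    # EMA of squared gradients
            m_hat = m / (1 - b1 ** self.t)      # bias-corrected first moment
            v_hat = v / (1 - b2 ** self.t)      # bias-corrected second moment
            p -= lr * m_hat / (np.sqrt(v_hat) + self.eps)


# Toy usage: minimize f(w) = ||w||^2 from a random start.
w = np.random.randn(10)
opt = CNAdamSketch([w])
for _ in range(500):
    opt.step([2 * w])  # gradient of ||w||^2 is 2w
```

The norm rescaling happens before the moment updates so that a rare outlier batch cannot inflate the second-moment estimate, which is one plausible reading of how a norm constraint would stabilize Adam's convergence; the paper itself should be consulted for the actual formulation.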
