Stochastic gradient descent with hyperbolic-tangent decay on classification

Bo Yang Hsueh, Wei Li, I-Chen Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

The learning rate schedule is a critical issue in deep neural network training. Several schedulers and methods have been proposed, including the step decay scheduler, adaptive methods, the cosine scheduler, and cyclical schedulers. This paper proposes a new scheduling method, named hyperbolic-tangent decay (HTD). We run experiments on several benchmarks: ResNet, Wide ResNet, and DenseNet on the CIFAR-10 and CIFAR-100 datasets, an LSTM on the PAMAP2 dataset, and ResNet on the ImageNet and Fashion-MNIST datasets. In our experiments, HTD outperforms the step decay and cosine schedulers in nearly all cases, while requiring fewer hyperparameters than step decay and being more flexible than the cosine scheduler. Code is available at https://github.com/BIGBALLON/HTD.
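The abstract does not reproduce the HTD formula itself. As a point of reference, below is a minimal Python sketch of a hyperbolic-tangent decay schedule of the form lr(t) = (lr₀/2)·(1 − tanh(L + (U − L)·t/T)); the bound values L = −6 and U = 3 used here are illustrative assumptions, not figures quoted from this record, so consult the linked repository for the authors' exact implementation.

```python
import math

def htd_lr(base_lr: float, step: int, total_steps: int,
           lower: float = -6.0, upper: float = 3.0) -> float:
    """Hyperbolic-tangent decay (HTD) sketch:
    lr(t) = base_lr / 2 * (1 - tanh(L + (U - L) * t / T)).

    `lower` (L) and `upper` (U) set where the tanh transition occurs;
    the defaults here are assumed for illustration.
    """
    progress = step / total_steps  # t / T, goes from 0 to 1 over training
    return base_lr / 2.0 * (1.0 - math.tanh(lower + (upper - lower) * progress))

# Example: the rate stays near base_lr early on, then smoothly decays toward 0.
if __name__ == "__main__":
    for epoch in (0, 50, 100, 150, 200):
        print(epoch, round(htd_lr(0.1, epoch, 200), 5))
```

Because tanh saturates at both ends, the schedule holds the learning rate nearly flat at the start, decays it smoothly through the middle of training, and flattens again near zero at the end, which is the flexibility over the cosine scheduler that the abstract alludes to (L and U shift and stretch the transition region).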

Original language: English
Title of host publication: Proceedings - 2019 IEEE Winter Conference on Applications of Computer Vision, WACV 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 435-442
Number of pages: 8
ISBN (Electronic): 9781728119755
DOIs
State: Published - 4 Mar 2019
Event: 19th IEEE Winter Conference on Applications of Computer Vision, WACV 2019 - Waikoloa Village, United States
Duration: 7 Jan 2019 – 11 Jan 2019

Publication series

Name: Proceedings - 2019 IEEE Winter Conference on Applications of Computer Vision, WACV 2019

Conference

Conference: 19th IEEE Winter Conference on Applications of Computer Vision, WACV 2019
Country: United States
City: Waikoloa Village
Period: 7/01/19 – 11/01/19

