Conference paper, 2019

Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures

Abstract

Autonomous design and optimization of neural networks is gaining increasing attention from the research community. The main barrier is the computational resources required to conduct experiments and production projects. Although most researchers focus on new design methodologies, the main computational cost remains the evaluation of candidate architectures. In this paper we investigate the feasibility of reduced-epoch training by measuring the rank correlation coefficients between sets of optimizers, given a fixed number of training epochs. We discover ranking correlations of more than 0.75 and up to 0.964 between Adam with 50 training epochs on the one hand, and stochastic gradient descent with Nesterov momentum with 10 training epochs and Adam with 20 training epochs on the other. Moreover, we show the ability of genetic algorithms to find high-quality solutions to a function by searching in a perturbed search space, given that certain correlation criteria are met.
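To illustrate the rank-correlation measurement described in the abstract, the sketch below compares how a set of candidate architectures is ranked under a full training budget versus a reduced one. This is a minimal sketch, not code from the paper: the accuracy values are invented, and the use of Kendall's tau and Spearman's rho from scipy.stats is an assumption about which rank correlation coefficients apply.

# Minimal sketch (assumed setup, not the authors' code): check whether a
# reduced training budget preserves the relative ranking of architectures.
from scipy.stats import kendalltau, spearmanr

# Hypothetical validation accuracies of the same five candidate architectures,
# evaluated under a full budget (e.g. Adam, 50 epochs) and a reduced budget
# (e.g. SGD with Nesterov momentum, 10 epochs). Values are invented.
acc_full    = [0.91, 0.88, 0.93, 0.85, 0.90]
acc_reduced = [0.82, 0.79, 0.86, 0.74, 0.80]

tau, p_tau = kendalltau(acc_full, acc_reduced)
rho, p_rho = spearmanr(acc_full, acc_reduced)
print(f"Kendall tau = {tau:.3f} (p = {p_tau:.3f})")
print(f"Spearman rho = {rho:.3f} (p = {p_rho:.3f})")

# A correlation close to 1 suggests the cheaper budget ranks the candidate
# architectures in (nearly) the same order as full training, so it can be
# used to screen candidates at a fraction of the cost.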

Dates and versions

hal-02331350, version 1 (24-10-2019)

Licence

Attribution

Identifiers

HAL Id: hal-02331350
DOI: 10.1007/978-3-030-19823-7_22

Cite

George Kyriakides, Konstantinos Margaritis. Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures. 15th IFIP International Conference on Artificial Intelligence Applications and Innovations (AIAI), May 2019, Hersonissos, Greece. pp.272-281, ⟨10.1007/978-3-030-19823-7_22⟩. ⟨hal-02331350⟩