Data Fine-Pruning: A Simple Way to Accelerate Neural Network Training
Conference paper, Network and Parallel Computing, 2018


Junyu Li, Ligang He, Shenyuan Ren, Rui Mao

Abstract

Training is the most time-consuming stage before a neural network can be deployed to applications. In this paper, we investigate the loss trend of the training data during the training process. We find that, given a fixed set of hyper-parameters, pruning specific types of training data can reduce the time consumption of training while maintaining the accuracy of the neural network. We develop a data fine-pruning approach that monitors and analyses the loss trend of training instances in real time and, based on the analysis results, temporarily prunes specific instances during training. Furthermore, we formulate the time consumption saved by applying our data fine-pruning approach. Extensive experiments with different neural networks are conducted to verify the effectiveness of the method. The experimental results show that applying data fine-pruning can reduce the training time by around 14.29% while maintaining the accuracy of the neural network.
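The full algorithm is given in the linked PDF; as a rough illustration only, the following Python sketch shows one plausible reading of the abstract's idea: track each training instance's recent losses and temporarily skip instances whose loss trend has flattened, re-admitting them after a fixed number of epochs. The class name LossTracker, the parameters window, flat_eps and prune_epochs, and the flat-trend pruning criterion are all assumptions for illustration, not the authors' implementation.

# Hypothetical sketch of loss-trend-based temporary data pruning
# (illustrative only; not the paper's actual method).
# Assumes per-sample losses are available each epoch, e.g. via a loss
# computed with reduction="none" in a framework such as PyTorch.
from collections import defaultdict, deque

class LossTracker:
    """Tracks recent per-instance losses; temporarily prunes instances
    whose loss has stopped changing, re-admitting them later."""

    def __init__(self, window=5, flat_eps=1e-3, prune_epochs=2):
        self.history = defaultdict(lambda: deque(maxlen=window))
        self.pruned_until = {}          # instance id -> epoch of re-admission
        self.flat_eps = flat_eps        # "loss barely changing" threshold
        self.prune_epochs = prune_epochs

    def record(self, idx, loss, epoch):
        h = self.history[idx]
        h.append(loss)
        # Prune when the loss range over the full window is nearly flat:
        # the instance is currently contributing little to learning.
        if len(h) == h.maxlen and max(h) - min(h) < self.flat_eps:
            self.pruned_until[idx] = epoch + self.prune_epochs
            h.clear()                   # restart the window after re-admission

    def is_active(self, idx, epoch):
        return epoch >= self.pruned_until.get(idx, -1)

# Usage inside a training loop (indices and per-sample losses assumed):
#   for idx, loss in zip(batch_indices, per_sample_losses.tolist()):
#       tracker.record(idx, loss, epoch)
# In the next epoch, train only on indices where tracker.is_active(idx, epoch).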
Main file: 477597_1_En_10_Chapter.pdf (300.85 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02279554, version 1 (05-09-2019)

Cite

Junyu Li, Ligang He, Shenyuan Ren, Rui Mao. Data Fine-Pruning: A Simple Way to Accelerate Neural Network Training. 15th IFIP International Conference on Network and Parallel Computing (NPC), Nov 2018, Muroran, Japan. pp.114-125, ⟨10.1007/978-3-030-05677-3_10⟩. ⟨hal-02279554⟩