Conference paper, Artificial Intelligence Applications and Innovations (AIAI), 2021

Neural Network Compression Through Shunt Connections and Knowledge Distillation for Semantic Segmentation Problems

Bernhard Haas, Alexander Wendt, Axel Jantsch, Matthias Wess

Abstract

Employing convolutional neural network models on large-scale datasets represents a big challenge. In particular, embedded devices with limited resources cannot run most state-of-the-art model architectures in real time, which is necessary for many applications. Shunt connections are a proposed method for MobileNet compression. This paper proves their applicability on large-scale datasets and narrows this computational gap. We are the first to provide results for shunt connections on the MobileNetV3 model and for segmentation tasks on the Cityscapes dataset, using the DeeplabV3 architecture, where we achieve a compression of 28% while observing a drop of 3.52 in mIoU. The training of shunt-inserted models is optimized through knowledge distillation. The full code used for this work will be available online.
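
The abstract describes two techniques: inserting shunt connections to compress the network, and training the shunt-inserted model with knowledge distillation from the original model. The paper and code are not reproduced on this page, so the fragment below is only a minimal sketch, assuming a PyTorch setup and the common per-pixel logit-distillation formulation for segmentation; the names teacher, student, alpha and temperature are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=4.0, ignore_index=255):
    # Hypothetical helper (not from the paper): combines the usual hard-label
    # loss with a soft-label term that matches the teacher's predictions.
    # student_logits, teacher_logits: (N, C, H, W); labels: (N, H, W) long.

    # Hard-label term: standard per-pixel cross entropy on the ground truth.
    ce = F.cross_entropy(student_logits, labels, ignore_index=ignore_index)

    # Soft-label term: KL divergence between temperature-softened per-pixel
    # class distributions of student and teacher (Hinton-style distillation).
    T = temperature
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)

    # alpha balances supervised training against imitating the teacher.
    return alpha * ce + (1.0 - alpha) * kd

# Sketch of a training step: the frozen teacher would be the original
# DeeplabV3/MobileNetV3 model, the student the shunt-inserted model.
# with torch.no_grad():
#     teacher_logits = teacher(images)
# loss = distillation_loss(student(images), teacher_logits, labels)
# loss.backward()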
Embargoed file (available in 0 years, 7 months, 0 days)

Dates and versions

hal-03287657, version 1 (15-07-2021)

Licence

Attribution - CC BY 4.0

Identifiers

HAL Id: hal-03287657
DOI: 10.1007/978-3-030-79150-6_28

Cite

Bernhard Haas, Alexander Wendt, Axel Jantsch, Matthias Wess. Neural Network Compression Through Shunt Connections and Knowledge Distillation for Semantic Segmentation Problems. 17th IFIP International Conference on Artificial Intelligence Applications and Innovations (AIAI), Jun 2021, Hersonissos, Crete, Greece. pp.349-361, ⟨10.1007/978-3-030-79150-6_28⟩. ⟨hal-03287657⟩