Network with Sub-networks: Layer-wise Detachable Neural Network
- DOI: 10.2991/jrnal.k.201215.006
- Model compression; neural networks; multilayer perceptron; supervised learning
In this paper, we introduce a network with sub-networks: a neural network whose layers can be detached into sub-neural networks during the inference phase. To develop trainable parameters that can be inserted into both the base-model and the sub-models, the parameters of the sub-models are first duplicated in the base-model. Each model is forward-propagated separately, and all models are grouped into pairs. Gradients from selected pairs of networks are averaged and used to update both networks in the pair. On the Modified National Institute of Standards and Technology (MNIST) and Fashion-MNIST datasets, our base-model achieves test accuracy identical to that of regularly trained models. The sub-models attain lower test accuracy; nevertheless, they serve as alternative inference paths with fewer parameters than regular models.
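The training scheme in the abstract can be illustrated with a minimal NumPy sketch. The architecture and dimensions below are assumptions for illustration, not the paper's exact configuration: a two-layer linear base-model whose first layer doubles as a one-layer sub-model (its weights are the duplicated parameters), both models forward-propagated on the same batch, and the gradients of the shared layer averaged across the pair before the update.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def xent(logits, y):
    # mean cross-entropy loss
    p = softmax(logits)
    return -np.log(p[np.arange(len(y)), y]).mean()

def xent_grad(logits, y):
    # gradient of mean cross-entropy w.r.t. the logits
    p = softmax(logits)
    p[np.arange(len(y)), y] -= 1.0
    return p / len(y)

D, C, lr = 8, 3, 0.1                # input dim, classes, learning rate (assumed)
W1 = rng.normal(0, 0.1, (D, C))     # shared layer: base layer 1 == sub-model
W2 = rng.normal(0, 0.1, (C, C))     # base-model-only layer 2

x = rng.normal(size=(32, D))        # toy batch standing in for MNIST inputs
y = rng.integers(0, C, size=32)

loss_before = xent((x @ W1) @ W2, y)

for _ in range(200):
    h = x @ W1                      # sub-model output (also the base hidden state)
    out = h @ W2                    # base-model output
    # back-propagate each model of the pair separately
    g_out = xent_grad(out, y)
    g_sub = xent_grad(h, y)
    gW2 = h.T @ g_out
    gW1_base = x.T @ (g_out @ W2.T)
    gW1_sub = x.T @ g_sub
    # average the pair's gradients on the duplicated layer; update both models
    W1 -= lr * 0.5 * (gW1_base + gW1_sub)
    W2 -= lr * gW2

loss_after = xent((x @ W1) @ W2, y)
```

At inference, the sub-model is detached by evaluating `x @ W1` alone, trading accuracy for a smaller parameter count, which matches the trade-off reported in the abstract. The nonlinearity is omitted here only to keep the shared-gradient arithmetic transparent.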
- © 2020 The Authors. Published by Atlantis Press B.V.
- Open Access
- This is an open access article distributed under the CC BY-NC 4.0 license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
Ninnart Fuengfusin and Hakaru Tamukoh, "Network with Sub-networks: Layer-wise Detachable Neural Network," Journal of Robotics, Networking and Artificial Life, vol. 7, no. 4, pp. 240–244, 2020. doi:10.2991/jrnal.k.201215.006