Proceedings of the 2nd International Conference on Mechatronics Engineering and Information Technology (ICMEIT 2017)

Compression of Conditional Deep Learning Network for Fast and Low Power Mobile Applications

Authors
Lijie Li, Yan Zhang, Pengfei Wang
Corresponding Author
Lijie Li
Available Online May 2017.
DOI
10.2991/icmeit-17.2017.33
Keywords
CDLN, one-shot whole network compression scheme, model size.
Abstract

CDLN (Conditional Deep Learning Network) is a convolutional neural network architecture with multiple classifiers. A CDLN can speed up classification, but the model is still too large for mobile devices. To address this issue, we present a method for compressing a CDLN, named the one-shot whole-network compression scheme. In the experiments, the model size and time cost are significantly reduced while the network loses only a little accuracy.
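The paper itself is not reproduced on this page, so the following is only a minimal, hypothetical sketch of the two ideas named in the abstract: a conditional (early-exit) CNN with an intermediate classifier, and a "one-shot" sweep that compresses every large fully connected layer in a single pass. The class and function names (ConditionalCNN, compress_linear), the layer sizes, the confidence threshold, the rank, and the choice of truncated SVD as the decomposition are assumptions for illustration, not the authors' actual scheme.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ConditionalCNN(nn.Module):
    """Toy CDLN-style network: an early (intermediate) classifier plus a final one."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.early_hidden = nn.Linear(16 * 14 * 14, 128)     # first-exit branch
        self.early_out = nn.Linear(128, num_classes)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.final_hidden = nn.Linear(32 * 7 * 7, 128)       # final-exit branch
        self.final_out = nn.Linear(128, num_classes)

    def forward(self, x, threshold: float = 0.9):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)           # 28x28 -> 14x14
        early_logits = self.early_out(F.relu(self.early_hidden(x.flatten(1))))
        confidence = F.softmax(early_logits, dim=1).max(dim=1).values
        if bool(confidence.min() >= threshold):               # confident enough: exit early
            return early_logits
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)            # 14x14 -> 7x7
        return self.final_out(F.relu(self.final_hidden(x.flatten(1))))


def compress_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Replace one Linear layer by two thinner ones via truncated SVD of its weight."""
    W = layer.weight.data                                     # shape (out, in)
    rank = min(rank, *W.shape)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=True)
    first.weight.data = Vh[:rank, :]                          # (rank, in)
    second.weight.data = U[:, :rank] * S[:rank]               # (out, rank)
    second.bias.data = layer.bias.data.clone()
    return nn.Sequential(first, second)


# "One-shot" sweep: decompose every large fully connected layer in a single pass,
# then fine-tune briefly (not shown) to recover the small accuracy loss.
model = ConditionalCNN()
for name in ("early_hidden", "final_hidden"):
    setattr(model, name, compress_linear(getattr(model, name), rank=16))

with torch.no_grad():
    logits = model(torch.randn(1, 1, 28, 28))
print(logits.shape)  # torch.Size([1, 10])

The factoring step is where the size reduction comes from: a weight matrix with out x in parameters becomes rank x (out + in) parameters, which is much smaller whenever the chosen rank is small relative to both dimensions.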

Copyright
© 2017, the Authors. Published by Atlantis Press.
Open Access
This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).


Volume Title
Proceedings of the 2nd International Conference on Mechatronics Engineering and Information Technology (ICMEIT 2017)
Series
Advances in Computer Science Research
Publication Date
May 2017
ISSN
2352-538X
DOI
10.2991/icmeit-17.2017.33

Cite this article

TY  - CONF
AU  - Lijie Li
AU  - Yan Zhang
AU  - Pengfei Wang
PY  - 2017/05
DA  - 2017/05
TI  - Compression of Conditional Deep Learning Network for Fast and Low Power Mobile Applications
BT  - Proceedings of the 2nd International Conference on Mechatronics Engineering and Information Technology (ICMEIT 2017)
PB  - Atlantis Press
SP  - 183
EP  - 186
SN  - 2352-538X
UR  - https://doi.org/10.2991/icmeit-17.2017.33
DO  - 10.2991/icmeit-17.2017.33
ID  - Li2017/05
ER  -