IEICE Technical Committee Submission System
Conference Paper's Information

Paper Abstract and Keywords
Presentation 2022-01-24 15:55
Accelerating Deep Neural Networks on Edge Devices by Knowledge Distillation and Layer Pruning
Yuki Ichikawa, Akira Jinguji, Ryosuke Kuramochi, Hiroki Nakahara (Titech) VLD2021-58 CPSY2021-27 RECONF2021-66
Abstract (in Japanese) (See Japanese page) 
(in English) Deep neural networks (DNNs) are computationally expensive, making them challenging to run on edge devices. Model compression techniques such as knowledge distillation and pruning have therefore been proposed. This research proposes an efficient method for compressing pretrained models using these techniques. We show that our method can compress models for edge devices in a short time, and we evaluate the trade-off between recognition accuracy and inference time on a Jetson Nano GPU and a DPU on a Xilinx FPGA.
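For readers unfamiliar with the techniques named in the abstract, the following is a minimal, generic PyTorch sketch of knowledge distillation combined with layer pruning. It is not the authors' implementation: the loss formulation follows the standard soft-target distillation recipe, and the temperature T, weight alpha, choice of ResNet-18, and which residual block is dropped are all illustrative assumptions.

import torch
import torch.nn.functional as F
import torchvision

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard KD loss: softened KL term plus cross-entropy on hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # scale to keep gradient magnitudes comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Teacher: the original network (weights omitted here to keep the sketch
# self-contained; in practice a pretrained checkpoint would be loaded).
teacher = torchvision.models.resnet18(weights=None).eval()

# Student: a layer-pruned copy. Dropping a trailing block of a ResNet stage is
# shape-safe because blocks after the first in a stage keep the channel count.
student = torchvision.models.resnet18(weights=None)
student.layer3 = torch.nn.Sequential(*list(student.layer3.children())[:-1])

# One illustrative training step on random data.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 1000, (8,))
with torch.no_grad():
    teacher_logits = teacher(images)
loss = distillation_loss(student(images), teacher_logits, labels)
loss.backward()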
Keyword (in Japanese) (See Japanese page) 
(in English) Knowledge Distillation / Layer Pruning / Deep Neural Network / Edge Device
Reference Info. IEICE Tech. Rep., vol. 121, no. 344, RECONF2021-66, pp. 49-54, Jan. 2022.
Paper # RECONF2021-66 
Date of Issue 2022-01-17 (VLD, CPSY, RECONF) 
ISSN Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)

Conference Information
Committee RECONF VLD CPSY IPSJ-ARC IPSJ-SLDM  
Conference Date 2022-01-24 - 2022-01-25 
Place (in Japanese) (See Japanese page) 
Place (in English) Online 
Topics (in Japanese) (See Japanese page) 
Topics (in English) FPGA Applications, etc. 
Paper Information
Registration To RECONF 
Conference Code 2022-01-RECONF-VLD-CPSY-ARC-SLDM 
Language Japanese 
Title (in Japanese) (See Japanese page) 
Sub Title (in Japanese) (See Japanese page) 
Title (in English) Accelerating Deep Neural Networks on Edge Devices by Knowledge Distillation and Layer Pruning 
Sub Title (in English)  
Keyword(1) Knowledge Distillation  
Keyword(2) Layer Pruning  
Keyword(3) Deep Neural Network  
Keyword(4) Edge Device  
1st Author's Name Yuki Ichikawa  
1st Author's Affiliation Tokyo Institute of Technology (Titech)
2nd Author's Name Akira Jinguji  
2nd Author's Affiliation Tokyo Institute of Technology (Titech)
3rd Author's Name Ryosuke Kuramochi  
3rd Author's Affiliation Tokyo Institute of Technology (Titech)
4th Author's Name Hiroki Nakahara  
4th Author's Affiliation Tokyo Institute of Technology (Titech)
Speaker Author-1 
Date Time 2022-01-24 15:55:00 
Presentation Time 25 minutes 
Registration for RECONF 
Paper # VLD2021-58, CPSY2021-27, RECONF2021-66 
Volume (vol) vol.121 
Number (no) no.342(VLD), no.343(CPSY), no.344(RECONF) 
Page pp.49-54 
#Pages 6
Date of Issue 2022-01-17 (VLD, CPSY, RECONF) 




The Institute of Electronics, Information and Communication Engineers (IEICE), Japan