Paper Abstract and Keywords
Presentation
2014-03-14 15:30
Experimental study on effect of pre-training in deep learning through visualization of unit outputs Tsubasa Ochiai (Doshisha Univ./NICT), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT) PRMU2013-210
Abstract
(in English)
To clarify the capabilities of the recently popular classifier concept of Deep Neural Networks (DNN), we experimentally investigate the effects of the pre-training used to initialize a DNN. A deep neural network is first pre-trained using Restricted Boltzmann Machines (RBM); it is then run as an embodiment of a Deep Belief Network, which basically possesses an associative memory function, and as a Deep Autoencoder, which is expected to realize a feature representation of an input pattern over the inner layers of the network. Analyses are conducted through the visualization of network unit outputs. Based on the experiments, we reveal that the RBM-based pre-training successfully makes networks memorize some information about the training patterns and also represent pattern features inside the networks.
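The RBM-based, greedy layer-wise pre-training the abstract refers to can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation; the layer sizes, learning rate, and CD-1 (one-step contrastive divergence) training loop are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli Restricted Boltzmann Machine trained with CD-1."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: hidden probabilities and a binary sample.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back down and up again.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Contrastive-divergence approximation to the log-likelihood gradient.
        n = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
        self.b_v += lr * (v0 - pv1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)

def pretrain_stack(data, layer_sizes, epochs=50):
    """Greedy layer-wise pre-training: each RBM models the
    hidden-unit probabilities produced by the layer below it."""
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(x)
        rbms.append(rbm)
        x = rbm.hidden_probs(x)   # activations become the next layer's input
    return rbms
```

The trained RBM weights would then initialize the corresponding DNN layers; visualizing `hidden_probs` at each layer, as the paper does with unit outputs, shows what the pre-trained network represents internally.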
Keyword
(in English)
Deep Learning / Pre-training / Visualization of unit outputs / Deep Neural Networks
Reference Info.
IEICE Tech. Rep., vol. 113, no. 493, PRMU2013-210, pp. 253-258, March 2014.
Paper #
PRMU2013-210
Date of Issue
2014-03-06 (PRMU)
ISSN
Print edition: ISSN 0913-5685 / Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
Download PDF
PRMU2013-210
|