IEICE Technical Committee Submission System
Conference Paper Information

Paper Abstract and Keywords
Presentation 2019-09-28 16:00
A comparison of Japanese pretrained BERT models
Naoki Shibayama, Rui Cao, Jing Bai, Wen Ma, Hiroyuki Shinnou (Ibaraki Univ.) NLC2019-24
Abstract (in English) BERT is a useful pre-training method for neural language processing. Pre-trained models for English were released alongside the original BERT paper. Three Japanese pre-trained models are now available; they use SentencePiece, Juman++ with BPE, or MeCab with NEologd, respectively, to tokenize input text into lexical units. In this paper, we compare these three models on a sentiment analysis task.
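The three models differ chiefly in their tokenizers (SentencePiece, Juman++ with BPE, and MeCab with NEologd). As a minimal illustrative sketch, not taken from the paper, the snippet below shows how such tokenizer differences could be inspected with the Hugging Face transformers library. The model identifier is an assumption (one publicly available MeCab-based Japanese BERT); the models compared in the paper were distributed separately by their authors and would be loaded from their own distributions in the same way.

    from transformers import AutoTokenizer

    # Illustrative model id (assumption, not from the paper): a MeCab-based
    # Japanese BERT published on the Hugging Face hub.
    MODEL_ID = "cl-tohoku/bert-base-japanese"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

    # "This movie was very interesting." -- the kind of input a sentiment
    # analysis task would feed to each model after tokenization.
    text = "この映画はとても面白かった。"
    print(tokenizer.tokenize(text))

    # Different tokenizers segment the same sentence into different subword
    # units, which is the main source of variation among the three models.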
Keyword (in English) machine learning / BERT / pre-trained model / natural language processing
Reference Info. IEICE Tech. Rep., vol. 119, no. 212, NLC2019-24, pp. 89-92, Sept. 2019.
Paper # NLC2019-24 
Date of Issue 2019-09-20 (NLC) 
ISSN Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)

Conference Information
Committee NLC IPSJ-DC  
Conference Date 2019-09-27 - 2019-09-28 
Place (in English) Future Corporation 
Topics (in English) The Thirteenth Text Analytics Symposium 
Paper Information
Registration To NLC 
Conference Code 2019-09-NLC-DC 
Language Japanese 
Title (in English) A comparison of Japanese pretrained BERT models 
Keyword(1) machine learning  
Keyword(2) BERT  
Keyword(3) pre-trained model  
Keyword(4) natural language processing  
1st Author's Name Naoki Shibayama  
1st Author's Affiliation Ibaraki University (Ibaraki Univ.)
2nd Author's Name Rui Cao  
2nd Author's Affiliation Ibaraki University (Ibaraki Univ.)
3rd Author's Name Jing Bai  
3rd Author's Affiliation Ibaraki University (Ibaraki Univ.)
4th Author's Name Wen Ma  
4th Author's Affiliation Ibaraki University (Ibaraki Univ.)
5th Author's Name Hiroyuki Shinnou  
5th Author's Affiliation Ibaraki University (Ibaraki Univ.)
Speaker Author-1 
Date Time 2019-09-28 16:00:00 
Presentation Time 25 minutes 
Registration for NLC 
Paper # NLC2019-24 
Volume (vol) vol.119 
Number (no) no.212 
Page pp.89-92 
#Pages 4
Date of Issue 2019-09-20 (NLC) 


The Institute of Electronics, Information and Communication Engineers (IEICE), Japan