All Technical Committee Conferences (Searched in: All Years)
Search Results: Conference Papers
Conference Papers (Available on Advance Programs), sorted by date descending

Fields: Committee / Date & Time / Place / Paper Title & Authors / Abstract / Paper #
Committee: NLC
Date/Time: 2023-09-06 14:10
Place: Osaka, Osaka Metropolitan University, Nakamozu Campus (Primary: On-site, Secondary: Online)
Title: Construction and Validation of Pre-trained Language Model Using Corpus of National and Local Assembly Minutes
Authors: Keiyu Nagafuchi (HU), Eisaku Sato, Yasutomo Kimura (OUC), Kazuma Kadowaki (JRI), Kenji Araki (HU)
Abstract: In recent years, there has been a surge in pre-trained language models based on large-scale corpora derived from the... [more]
Paper #: NLC2023-3, pp.12-17
Committee: NLC, IPSJ-NL
Date/Time: 2023-03-18 11:45
Place: Okinawa, OIST (Primary: On-site, Secondary: Online)
Title: Detection of Parkinson's disease patients from interview data using pre-trained language models
Authors: Aiichiro Hayatsu, Ryohei Sasano, Koichi Takeda (Nagoya Univ.)
Abstract: It is estimated that Parkinson's disease (PD) affects 150,000 people in Japan, and the number of patients increases as s... [more]
Paper #: NLC2022-24, pp.28-31
Committee: NLC, IPSJ-NL, SP, IPSJ-SLP
Date/Time: 2022-12-01 15:20
Place: Tokyo (Primary: On-site, Secondary: Online)
Title: Domain and language adaptation of large-scale pretrained model for speech recognition of low-resource language
Authors: Kak Soky (Kyoto University), Sheng Li (NICT), Chenhui Chu, Tatsuya Kawahara (Kyoto University)
Abstract: Self-supervised learning (SSL) models are effective for automatic speech recognition (ASR). Due to the huge paramete... [more]
Paper #: NLC2022-17 SP2022-37, pp.45-49
Committee: KBSE, SC
Date/Time: 2020-11-13 15:22
Place: Online + Kikai-Shinko-Kaikan Bldg. (Primary: Online, Secondary: On-site)
Title: [Poster Presentation] Automation of Ontology Generation by Pre-trained Language Model
Authors: Atusi Oba, Ayato Kuwana, Paik Incheon (UoA)
Abstract: As an initial attempt at ontology generation with a neural network, a Recurrent Neural Network (RNN) based method is propose... [more]
Paper #: KBSE2020-22 SC2020-26, p.40
Committee: NLC, IPSJ-DC
Date/Time: 2019-09-28 16:00
Place: Tokyo, Future Corporation
Title: A comparison of Japanese pretrained BERT models
Authors: Naoki Shibayama, Rui Cao, Jing Bai, Wen Ma, Hiroyuki Shinnou (Ibaraki Univ.)
Abstract: BERT is a useful pre-training method for natural language. There are pre-trained models for English which were used in a pa... [more]
Paper #: NLC2019-24, pp.89-92
Copyright and reproduction :
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)