Online edition: ISSN 2432-6380
NLC2022-19
Analysis of Bacterial Flora in Plant Rhizosphere by Topic Model
Isana Makabe, Masayuki Yamamura (Tokyo Tech)
pp. 1 - 6
NLC2022-20
(See Japanese page.)
pp. 7 - 11
NLC2022-21
Variable Description Prediction Method Based on Nomenclature in Chemical Engineering Domain Papers
Shota Kato, Manabu Kano (Kyoto U.)
pp. 12 - 15
NLC2022-22
Estimating Named Entity Label Representation for Generative Low-Resource NER
Yuya Sawada (NAIST), Hiroki Teranishi (RIKEN AIP), Hiroki Ouchi (NAIST), Yuji Matsumoto (RIKEN AIP), Taro Watanabe (NAIST)
pp. 16 - 21
NLC2022-23
Creation of Question and Answer System Using Japanese Knowledge Graph and Investigation of the Graph Size Influence
Kazuki Yano, Rafal Rzepka, Kenji Araki (Hokkaido Univ.)
pp. 22 - 27
NLC2022-24
Detection of Parkinson's Disease Patients from Interview Data Using Pre-trained Language Models
Aiichiro Hayatsu, Ryohei Sasano, Koichi Takeda (Nagoya Univ.)
pp. 28 - 31
NLC2022-25
Relationship between ESG Performance and Narcissistic Text
Yuriko Nakao (Kansai Univ.), Aya Ishino (Hiroshima Economics Univ.), Kana Okada (Osaka Economics Univ.), Hitoshi Okada (Hiroshima Economics Univ.)
pp. 32 - 37
NLC2022-26
An Experimental Analysis of Sub-tasks for Multi-task Learning-based Text Classification
Yusuke Kimura (Doshisha Univ.), Takahiro Komamizu (Nagoya Univ.), Kenji Hatano (Doshisha Univ.)
pp. 38 - 43
NLC2022-27
A Study on the Acceptance and Usefulness of NLG in Tanka Poetry
Toru Urakawa, Takuro Niitsuma, Yuya Taguchi, Hideaki Tamori (Asahi Shimbun), Naoaki Okazaki (Tokyo Tech), Kentaro Inui (Tohoku Univ./RIKEN)
pp. 44 - 49
NLC2022-28
Contrastive Learning with Attention Pooling for Long Document Summarization
Tsukasa Kamo, Toru Sugimoto (SIT)
pp. 50 - 54
NLC2022-29
Collection of Textual Expressions in the Wild Toward Voice-quality Control from Free Description
Aya Watanabe, Shinnosuke Takamichi, Yuki Saito, Hiroshi Saruwatari (UTokyo)
pp. 55 - 60
Note: Each article is a technical report without peer review, and its polished version will be published elsewhere.