Committee | Date Time | Place | Paper Title / Authors | Abstract | Paper #
SIP, MI, IE |
2019-05-24 09:00 |
Aichi |
|
Hierarchical Personal Identification in Short and Long Distance using Blink Motion Features Mariko Nakano, Ritsuko Tokunaga, Yoko Uchida, Daisuke Sugimura (Tsuda Univ.) SIP2019-7 IE2019-7 MI2019-7 |
We propose a method for person identification using blink motion features. Previous methods have difficulty achieving a p... [more]
SIP2019-7 IE2019-7 MI2019-7 pp.29-32 |
IMQ, IE, MVE, CQ (Joint) [detail] |
2019-03-14 10:50 |
Kagoshima |
Kagoshima University |
Estimation of Video Viewers' Emotion by SRC using Bio-signals and Facial Feature Points Yui Tagami, Mutsumi Suganuma, Wataru Kameyama (Waseda Univ.), Simon Clippingdale (NHK STRL) CQ2018-95 |
Aiming to build a more accurate content recommendation system, we estimate video viewers’ emotion based on questionnaire... [more]
CQ2018-95 pp.19-24 |
CS |
2018-11-01 11:00 |
Ehime |
The Shiki Museum |
Emotion Extraction from Face Images and its Quantification/Visualization
-- Experimental Evaluation in Music Concerts and Rakugo Performance -- Seima Todo, Tetsuo Tsujioka, Kazunobu Okazaki, Akihiro Odanaka (Osaka City Univ.) CS2018-63 |
Nowadays, machine learning, especially deep learning techniques for image classification, has been one of the attractive re... [more]
CS2018-63 pp.43-49 |
NLC, IPSJ-DC |
2018-09-06 10:10 |
Tokyo |
Seikei University |
An Evaluation Method for Estimating the Degree of Difficulty to Extract Writer's Emotion based on Response Time in Annotating Emotion Sanae Yamashita, Yasushi Kami (NIT, Akashi College), Eri Kato, Takeshi Sakai, Noriyuki Okumura (Otemae Univ.) NLC2018-9 |
This research examines the degree of difficulty in estimating emotions in Japanese short sentences based on the response... [more] |
NLC2018-9 pp.1-6 |
RCC, MICT |
2018-05-24 13:00 |
Tokyo |
Tokyo Big Sight |
[Poster Presentation]
Investigation of Feature Reduction for Body Motion Identification using Radio Channel Characteristics in Wireless BAN Yuki Ichikawa, Minseok Kim (Niigata Univ.) RCC2018-4 MICT2018-4 |
In this article, the reduction of the features used for human motion classification using decision tree machine learning... [more] |
RCC2018-4 MICT2018-4 pp.17-20 |
HCGSYMPO (2nd) |
2017-12-13 - 2017-12-15 |
Ishikawa |
THE KANAZAWA THEATRE |
A Consideration on Mapping between User Emotion Classification by Bio-signals and Questionnaire Response while Watching Video Ying Fu, Mutsumi Suganuma, Wataru Kameyama (Waseda Univ.), Simon Clippingdale (NHK STRL) |
In order to find a method of estimating the video viewers’ emotion, we analyze video viewers’ emotions using pupil size,... [more] |
|
WIT, HI-SIGACI |
2017-12-07 14:25 |
Tokyo |
AIST Tokyo Waterfront |
A study on an indexing method of continuous JSL fingerspelling with moving action alphabets using motion features Masaya Kato, Masashi Morimoto (AIT) WIT2017-60
The promotion of sign language is important for communication with hearing-impaired people. Fingerspelling is one of ... [more]
WIT2017-60 pp.115-120 |
HIP |
2017-10-23 16:30 |
Kyoto |
Kyoto Terrsa |
Classification of EEG and NIRS signals induced by affective pictures Shingo Ryu, Hiroshi Higashi (TUT), Junya Muramatsu (TMC), Shigeki Nakachi, Tetsuto Minami (TUT) HIP2017-64 |
Brain activity can be induced by emotional stimuli. We classified brain activities induced by affective pictures from IA... [more] |
HIP2017-64 pp.39-42 |
MBE |
2017-09-23 13:25 |
Nagano |
National Institute of Technology, Nagano College |
Detecting Emotional Suppression in the Presence of Disgust by Time Series Change of Cerebral Blood Flow using fNIRS Masahiro Honda, Hiroki Tanaka, Sakriani Sakti, Satoshi Nakamura (NAIST) MBE2017-35 |
A form of emotional suppression is defined as the conscious inhibition of emotional-expressive behaviors while emotional... [more] |
MBE2017-35 pp.5-10 |
CQ |
2017-08-29 12:10 |
Tokyo |
Tokyo University of Science |
Transmission Power Control Using Human Motion Classification in Daily Human Motion Scenarios for WBAN Sukhumarn Archasantisuk, Takahiro Aoyagi (Tokyo Tech) CQ2017-64 |
In this paper, transmission power control using human motion classification for wireless body area network (WBAN) is dev... [more] |
CQ2017-64 pp.75-80 |
MBE, NC (Joint) |
2017-05-26 14:15 |
Toyama |
Toyama Prefectural Univ. |
Frequency Filter Networks on EEG Data for Emotion Analysis Miku Yanagimoto, Chika Sugimoto, Tomoharu Nagao (YNU) NC2017-4 |
In EEG-based emotion recognition (EEG-ER), enhancing feature extractors is often difficult. In such cases, the use of ... [more]
NC2017-4 pp.19-24 |
HCS, HIP, HI-SIGCOASTER [detail] |
2017-05-16 15:20 |
Okinawa |
Okinawa Industry Support Center |
Indexing of riders’ emotion on motorcycle based on core-affect model
-- Classification of riders through emotional reaction evaluation toward pictures -- Masashi Sugimoto, Shota Imai, Kenji Katahira, Yoichi Yamazaki, Noriko Nagata (Kwansei Gakuin Univ.), Ayako Masuda, Kobue Iwata, Hajime Uchiyama (Honda R&D Co., Ltd.) HCS2017-16 HIP2017-16
The present research categorized riders through emotional reaction evaluation toward motorcycle pictures. Based on each ... [more] |
HCS2017-16 HIP2017-16 pp.123-126 |
IE, ITS, ITE-AIT, ITE-HI, ITE-ME, ITE-MMS, ITE-CE [detail] |
2017-02-20 15:30 |
Hokkaido |
Hokkaido Univ. |
A note on estimation of users' emotion evoked during listening to music
-- Performance improvement based on fusion of multiple estimation results -- Boxiao Duan, Takahiro Ogawa, Miki Haseyama (Hokkaido Univ.)
This paper presents estimation of users' emotion evoked during listening to music. In our method, we focus on RUSBagging ... [more]
|
IN |
2017-01-20 10:20 |
Aichi |
|
Spatio-Temporal Emotion Estimation for Automatic Map Generation of Emotion Distributions Satoru Watanabe, Komei Arasawa, Motoki Eida, Syun Hattori (Muroran Inst. of Tech.) IN2016-94 |
A place affects its visitors' emotions. For instance, a person who is tired of work climbs Mt. Fuji an... [more]
IN2016-94 pp.55-60 |
ASN, MoNA, MICT (Joint) |
2017-01-20 14:15 |
Oita |
|
An Investigation of Body Motion Identification Method using Radio Channel Characteristics for BAN Context-Aware Communications Yuki Ichikawa, Minseok Kim (Niigata Univ.) MICT2016-74 |
In this article, human motion classification to realize context-aware BAN has been empirically investigated. We develope... [more] |
MICT2016-74 pp.53-56 |
HCGSYMPO (2nd) |
2016-12-07 - 2016-12-09 |
Kochi |
Kochi City Culture Plaza (CUL-PORT) |
Emotion category mapping to emotional space by cross-corpus labelling
-- Psychological and acoustical examination of emotion perception -- Yoshiko Arimoto (SIT), Hiroki Mori (Utsunomiya Univ.)
Psychological classification of emotion has two main viewpoints. One is the emotion-category view, in which emotion is classified into... [more]
|
PRMU, BioX |
2016-03-24 10:00 |
Tokyo |
|
Temporal Spotting of Human Actions in Video Based on Voting Framework with a Hierarchical Action Model Keita Hara, Kazuaki Nakamura, Noboru Babaguchi (Osaka Univ.) BioX2015-42 PRMU2015-165 |
This paper focuses on the task of temporal action spotting: temporal segmentation and classification of human actions in... [more] |
BioX2015-42 PRMU2015-165 pp.7-12 |
MICT, ASN, MoNA (Joint) |
2016-01-28 16:05 |
Kanagawa |
Hotel Okada |
[Poster Presentation]
Motion Training Support Method Based on Functional Electrical Stimulation and Motion Estimation Hiroaki Hanai, Keisuke Shima (YNU), Koji Shimatani (Prefectural Univ. of Hiroshima) MICT2015-45 |
In a previous paper, the authors outlined a technique for motion communication involving functional electrical stimulati... [more] |
MICT2015-45 pp.27-29 |
HIP |
2015-07-19 11:20 |
Fukuoka |
Kyushu Sangyo University |
The classification of paintings by color statistics Masatoshi Kitaguchi, Masahiro Wakabayashi, Hiromichi Sato, Tomoyuki Naito (Osaka Univ.) HIP2015-60 |
Several studies reported that statistics of visual properties of paintings might be distinctive among painters, and that... [more] |
HIP2015-60 pp.93-98 |
HCGSYMPO (2nd) |
2014-12-17 - 2014-12-19 |
Yamaguchi |
Kaikyo Messe Shimonoseki |
Discrimination of facial expression using 3D trajectory of sparsely extracted feature points on the face Kaori Iwasa, Syunsuke Nagata, Yoshinori Inaba, Shigeru Akamatsu (Hosei Univ.)
We defined feature vectors that represent the 3D coordinates of the marker points attached on human face, which were mea... [more] |
|