IEICE Technical Committee Submission System
All Technical Committee Conferences  (Searched in: All Years)

Search Results: Conference Papers (available on advance programs; sorted by date, descending)
Results 1 - 20 of 29
(Each entry lists: committee / date and time / place; paper title; authors; abstract; paper number and pages)

PRMU, IPSJ-CVIM / 2020-03-17 16:50 / Kyoto (Cancelled but technical report was issued)
Experimental Evaluation for Bayes Error Estimation Capability of Large Geometric Margin Minimum Classification Error Training
Ikuhiro Nishiyama (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
Previous studies suggested that the Large Geometric Margin Minimum Classification Error (LGM-MCE) training method had th... [more]
PRMU2019-99, pp.231-236

PRMU, IPSJ-CVIM / 2020-03-17 17:05 / Kyoto (Cancelled but technical report was issued)
Experimental Evaluation on Bayes Error Estimation Capability of Kernel Minimum Classification Error Training
Koji Yamada (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
A pattern classifier incorporating kernel mapping, which is trained by the Kernel Minimum Classification Error (KMCE) tr... [more]
PRMU2019-100, pp.237-242

PRMU, IPSJ-CVIM / 2020-03-17 17:20 / Kyoto (Cancelled but technical report was issued)
Study on Maximum Bayes Boundary-ness Training for Pattern Classification
Masahiro Senda, David Ha (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
PRMU2019-101, pp.243-248

PRMU / 2018-12-14 15:50 / Miyagi
Bayes Boundary Estimation Capability Assessment for Large Geometric Margin Minimum Classification Error Training
Ikuhiro Nishiyama (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
The recent Large Geometric Margin Minimum Classification Error training has, based on the smoothness of its smooth clas... [more]
PRMU2018-92, pp.91-96

PRMU / 2018-12-14 16:05 / Miyagi
Experimental Evaluation of Automatic Determination of Loss Smoothness for Minimum Classification Error Training
Kazuma Kobayashi (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
(To be available after the conference date)
PRMU2018-93, pp.97-102

PRMU, CNR / 2018-02-20 11:45 / Wakayama
A Classification-Uncertainty-Based Criterion for Classification Boundary Selection
David Ha (Doshisha Univ.), Juliette Maes (ECL), Yuya Tomotoshi (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
PRMU2017-166, CNR2017-44, pp.121-126

PRMU, CNR / 2017-02-18 11:20 / Hokkaido
Small-sized Kernel Classifier By Support Vector Retraining Based on Minimum Classification Error Criterion
Ryoma Tani (Doshisha Univ.), Hideyuki Watanabe (ATR), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
Different from the Multi-class Support Vector Machine (MSVM) that fixes Support Vectors (SVs) to training samples, the K... [more]
PRMU2016-159, CNR2016-26, pp.41-46

PRMU / 2015-12-21 09:30 / Nagano
Evaluation of Automatic Prototype-Model Size Optimization in Large Geometric Margin Minimum Classification Error Training
Masahiro Ogino (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Xugang Lu, Hisashi Kawai (NICT)
To develop a method for finding an appropriate class model size, which leads to accurate classification over unseen patter... [more]
PRMU2015-100, pp.1-6

SP, IPSJ-SLP (Joint) / 2015-07-16 17:20 / Nagano (Katakura Suwako Hotel)
Experimental evaluation of network size effect in speaker adaptive trained DNNs embedding linear transformation networks
Tsubasa Ochiai (Doshisha Univ./NICT), Shigeki Matsuda (Doshisha Univ.), Hideyuki Watanabe, Xugang Lu, Hisashi Kawai (NICT), Shigeru Katagiri (Doshisha Univ.)
Recently we proposed a novel speaker adaptation method that applied the Speaker Adaptive Training (SAT) concept to DNN-... [more]
SP2015-41, pp.31-36

PRMU, IPSJ-CVIM, MVE / 2015-01-23 09:50 / Nara
Analysis of Minimum Classification Error Training using Bit-String-Based Genetic Algorithms
Hiroto Togoe (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri (Doshisha Univ.), Xugang Lu, Chiori Hori (NICT), Miho Ohsaki (Doshisha Univ.)
Minimum Classification Error (MCE) training using gradient-descent-based loss minimization does not guarantee a global m... [more]
PRMU2014-100, MVE2014-62, pp.171-176

PRMU, IPSJ-CVIM, MVE / 2015-01-23 10:15 / Nara
Relation between Data Grouping and Robustness to Unseen Data in Large Geometric Margin Minimum Classification Error Training
Hiroyuki Shiraishi (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri (Doshisha Univ.), Xugang Lu, Chiori Hori (NICT), Miho Ohsaki (Doshisha Univ.)
To develop a pattern classifier that is robust to unseen pattern samples, classifier parameters have been conventionally... [more]
PRMU2014-101, MVE2014-63, pp.177-182

PRMU / 2014-03-14 15:30 / Tokyo
Experimental study on effect of pre-training in deep learning through visualization of unit outputs
Tsubasa Ochiai (Doshisha Univ./NICT), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT)
To clarify the capability of the recent powerful classifier concept, Deep Neural Networks (DNN), we experimentally investig... [more]
PRMU2013-210, pp.253-258

PRMU, IPSJ-CVIM, MVE / 2014-01-23 09:30 / Osaka
Minimum Classification Error Training with Automatic Determination of Loss Smoothness Common to All Classes
Kensuke Ota (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT)
The smoothness of the smooth classification error count loss used in the Minimum Classification Error (MCE) training has... [more]
PRMU2013-91, MVE2013-32, pp.1-6

PRMU, IPSJ-CVIM, MVE / 2014-01-23 10:00 / Osaka
Minimum Classification Error Training with Automatic Control of Loss Smoothness
Hideaki Tanaka (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT)
The Minimum Classification Error (MCE) training has been successfully applied to various types of classifiers. However, ... [more]
PRMU2013-92, MVE2013-33, pp.7-12

PRMU, IPSJ-CVIM, MVE / 2014-01-23 10:30 / Osaka
Multi-Class Support Vector Machine based on Minimum Classification Error Criterion
Hisashi Uehara (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT)
Gradient-descent-based optimization methods used in Minimum Classification Error (MCE) training are not necessarily easi... [more]
PRMU2013-93, MVE2013-34, pp.13-18

PRMU, IPSJ-CVIM, MVE / 2014-01-23 11:00 / Osaka
Large Geometric Margin Minimum Classification Error Training with Automatic Optimization of the Number of Prototypes
Yuji Takayama (Doshisha Univ.), Hideyuki Watanabe (NICT), Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.), Shigeki Matsuda, Chiori Hori (NICT)
Large Geometric Margin Minimum Classification Error (LGM-MCE) training, which adopts geometric-margin-based misclassific... [more]
PRMU2013-94, MVE2013-35, pp.19-24

LQE, LSJ / 2013-05-17 15:20 / Ishikawa
Optical and spin properties of nitrogen-vacancy centers in diamond fabricated using nitrogen-doped isotopically-enriched chemical vapor deposition
Tomohiro Gomi, Shuhei Tomizawa (Keio Univ.), Hideyuki Watanabe, Hitoshi Umezawa, Shin-ichi Shikata (AIST), Kohei M. Itoh, Junko Ishi-Hayase (Keio Univ.)
The electronic spin state of nitrogen-vacancy (NV) centers in diamond is expected to be a promising candidate for quantum i... [more]
LQE2013-12, pp.53-56

PRMU / 2011-03-11 09:30 / Ibaraki
Application of Automatic Loss Smoothness Control to Large Geometric Margin Minimum Classification Error Training
Tsukasa Ohashi (Doshisha Univ.), Hideyuki Watanabe (NICT), Jun'ichi Tokuno, Shigeru Katagiri, Miho Ohsaki (Doshisha Univ.)
A method that automatically controls the smoothness of a smoothed classification error count loss using Parzen estimatio... [more]
PRMU2010-270, pp.195-200

EMD / 2011-03-04 14:30 / Saitama (Nippon Institute of Technology)
Comparison of hauling force between series & parallel connection of electric motor by readhesion control in 5-inches large model electric locomotive
Hideyuki Watanabe, Hidenori Ito, Daichi Kawarahata, Naohiro Saito, Yu Yifan, Kiyoshi Onodera, Lu Zijun, Makoto Takanezawa, Noboru Morita (Nippon Inst. of Tech.)
This research concerns the improvement of hauling force for electric rolling stock. In recent years, as a res... [more]
EMD2010-160, pp.33-36

PRMU, FM / 2010-12-09 16:40 / Yamaguchi
Large Geometric Margin Minimum Classification Error Training for Kernel-based High Dimensional Space
Hideyuki Watanabe (NICT), Shigeru Katagiri, Mamoru Adachi, Miho Ohsaki (Doshisha Univ.)
Large Geometric Margin Minimum Classification Error (LGM-MCE) training has been successfully applied to multi-class clas... [more]
PRMU2010-136, pp.55-60


The Institute of Electronics, Information and Communication Engineers (IEICE), Japan