Paper Abstract and Keywords
Presentation date: 2009-03-13 15:20
A Proposal of Ensemble-based Minimum Classification Error Training
Hideyuki Watanabe (NICT/ATR), Shigeru Katagiri, Kohta Yamada (Doshisha Univ.), Atsushi Nakamura, Erik McDermott, Shinji Watanabe (NTT), Shin'ichi Taniguchi, Naho Nishijima, Miho Ohsaki (Doshisha Univ.)
PRMU2008-250
Abstract
We propose an ensemble-based minimum classification error (MCE) training method that combines multiple weak classifiers in a manner consistent with the ultimate goal of classifier design, Bayes error estimation. First, we discuss boosting, a key ensemble training methodology, from the viewpoints of mathematical optimality for loss minimization and its relationship to Bayes error estimation. We also review the basic concept of MCE training and elucidate the relationship between boosting and MCE by analyzing their loss minimization procedures. We then propose an ensemble-based training method, named Ensemble-based MCE, which in principle leads to the Bayes error condition for a general multi-class task.
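The MCE training the abstract refers to is built on a smoothed 0-1 loss: a misclassification measure compares the true-class discriminant score with a soft-max over the competing classes, and a sigmoid turns that measure into a differentiable loss. The following minimal Python sketch illustrates this standard formulation only; it is not the authors' ensemble method, and the parameter values (`eta`, `gamma`) are illustrative assumptions.

```python
import numpy as np

def mce_loss(scores, label, eta=5.0, gamma=1.0):
    """Smoothed 0-1 loss used in standard MCE training.

    scores : (K,) array of discriminant values g_k(x)
    label  : index of the true class
    eta    : sharpness of the soft-max over competing classes
    gamma  : slope of the sigmoid smoothing
    """
    g_true = scores[label]
    others = np.delete(scores, label)
    # Soft-max of the competing-class scores; eta -> inf recovers the max.
    competitor = np.log(np.mean(np.exp(eta * others))) / eta
    d = -g_true + competitor                  # misclassification measure
    return 1.0 / (1.0 + np.exp(-gamma * d))  # sigmoid smoothing of 0-1 loss
```

The loss is near 0 when the true class clearly wins (d is strongly negative) and near 1 when it clearly loses, so averaging it over training samples gives a smooth estimate of the empirical classification error, which gradient descent can then minimize.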
Keywords
ensemble / minimum classification error / MCE / boosting / Bayes error
Reference Info.
IEICE Tech. Rep., vol. 108, no. 484, PRMU2008-250, pp. 71-76, March 2009.
Paper #
PRMU2008-250
Date of Issue
2009-03-06 (PRMU)
ISSN
Print edition: ISSN 0913-5685  Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |