Presentation Abstract / Keywords
Presentation Title
2012-03-13 16:55
Squared-loss Mutual Information Regularization ○Gang Niu・Wittawat Jitkrittum・Hirotaka Hachiya(Tokyo Inst. of Tech.)・Bo Dai(Purdue Univ.)・Masashi Sugiyama(Tokyo Inst. of Tech.) IBISML2011-108 |
Abstract
(English)
The information maximization principle is a useful alternative to the low-density separation principle: it prefers probabilistic classifiers that maximize the mutual information (MI) between data and labels. In this paper, we propose an approach to semi-supervised learning called squared-loss mutual information (SMI) regularization, which replaces MI with the novel information measure SMI. SMI regularization is the first framework that offers algorithms all four of the following properties: an analytical solution, out-of-sample classification, multi-class classification, and probabilistic outputs. As an information-theoretic framework, it is directly related to manifold regularization and yields learning algorithms with data-dependent risk bounds. Experiments demonstrate that SMI regularization compares favorably with existing information-theoretic regularization approaches.
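For reference, SMI replaces the logarithm in ordinary mutual information with a squared deviation: for discrete variables, SMI = (1/2) Σ_x Σ_y p(x)p(y) (p(x,y)/(p(x)p(y)) − 1)², i.e. half the Pearson chi-squared divergence between the joint distribution and the product of its marginals. The following is a minimal plug-in sketch of this quantity for a known discrete joint distribution; the function name and the toy distributions are illustrative, not from the paper, and the paper itself estimates SMI from samples rather than from a given distribution.

```python
import numpy as np

def smi_discrete(joint):
    """Squared-loss mutual information (SMI) of a discrete joint
    distribution: half the Pearson divergence between p(x, y) and
    p(x) p(y).  `joint` is a 2-D array of probabilities summing to 1."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x), column vector
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y), row vector
    prod = px * py                          # independence model p(x) p(y)
    # Density ratio p(x, y) / (p(x) p(y)); cells with zero-probability
    # marginals contribute nothing to the sum.
    safe = np.where(prod > 0, prod, 1.0)
    ratio = np.where(prod > 0, joint / safe, 0.0)
    return 0.5 * np.sum(prod * (ratio - 1.0) ** 2)

# Independent variables give SMI = 0.
indep = np.outer([0.5, 0.5], [0.3, 0.7])
print(smi_discrete(indep))   # 0.0

# A perfectly dependent binary pair gives SMI = 0.5.
dep = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(smi_discrete(dep))     # 0.5
```

Because the integrand is a squared density ratio rather than a log ratio, sample-based estimators of SMI admit analytical solutions, which is what the abstract's "analytical solution" property refers to.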
Keywords
(English)
semi-supervised learning / information-theoretic learning / squared-loss mutual information
Bibliographic Information
IEICE Technical Report, vol. 111, no. 480, IBISML2011-108, pp. 147-153, March 2012.
Document Number
IBISML2011-108 |
Date of Issue
2012-03-05 (IBISML) |
ISSN
Print edition: ISSN 0913-5685 / Online edition: ISSN 2432-6380
Copyright
The copyright of papers published in the Technical Report belongs to the Institute of Electronics, Information and Communication Engineers (IEICE). (Permission numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)
PDF Download
IBISML2011-108 |