Presentation abstract / Keywords |
Presentation title |
2012-03-13 16:30
Feature Selection via L1-Penalized Squared-Loss Mutual Information ○Wittawat Jitkrittum・Hirotaka Hachiya・Masashi Sugiyama(Tokyo Inst. of Tech.) IBISML2011-107 |
Abstract |
(Japanese) |
(Not yet registered) |
(English) |
Feature selection is a technique for screening out less important features. Many existing supervised feature selection algorithms use redundancy and relevance as the main criteria for selecting features. However, feature interaction, potentially a key characteristic of real-world problems, has not received much attention. As an attempt to take feature interaction into account, we propose $\ell_1$-LSMI, an $\ell_1$-regularization-based algorithm that maximizes a squared-loss variant of mutual information between selected features and outputs. Numerical results show that $\ell_1$-LSMI performs well in handling redundancy, detecting non-linear dependency, and taking feature interaction into account. |
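The squared-loss mutual information (SMI) that the abstract refers to is typically estimated by least-squares density-ratio fitting (LSMI). Below is a minimal numerical sketch of such an estimator, not the paper's implementation: the Gaussian kernel width `sigma` and regularizer `lam` are fixed illustrative values (the method would choose them by cross-validation), all samples are used as kernel centers, and the function name `lsmi` is our own.

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=1e-3):
    """Sketch of a squared-loss mutual information estimator via
    least-squares density-ratio fitting.  Models the density ratio
    r(x, y) = p(x, y) / (p(x) p(y)) as a kernel expansion and plugs
    the fit into the SMI formula.  Hyperparameters are fixed here
    for illustration only."""
    n = len(x)
    x = np.asarray(x, dtype=float).reshape(n, -1)
    y = np.asarray(y, dtype=float).reshape(n, -1)
    # Gaussian kernel matrices, with every sample used as a center.
    Kx = np.exp(-((x[:, None, :] - x[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    Ky = np.exp(-((y[:, None, :] - y[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    # H_{ll'} = (1/n^2) [sum_i Kx_il Kx_il'] [sum_j Ky_jl Ky_jl']
    # (expectation over the product of marginals p(x) p(y)).
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    # h_l = (1/n) sum_i Kx_il Ky_il  (expectation over the joint p(x, y)).
    h = (Kx * Ky).mean(axis=0)
    # Ridge-regularized least-squares fit of the ratio coefficients.
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    # Plug-in SMI estimate: (1/2) h^T alpha - 1/2 (zero iff x and y
    # are independent, in the population limit).
    return 0.5 * h @ alpha - 0.5
```

In the $\ell_1$-LSMI setting, such an estimate would be computed on feature subsets weighted by an $\ell_1$-penalized parameter vector; the sketch above shows only the inner SMI estimation step. On strongly dependent data the estimate comes out clearly larger than on independent data of the same size.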
Keywords |
(Japanese) |
(Not yet registered) |
(English) |
feature selection / $\ell_1$-regularization / squared-loss mutual information / density-ratio estimation / dimensionality reduction |
Bibliographic information |
IEICE Technical Report, vol. 111, no. 480, IBISML2011-107, pp. 139-146, March 2012. |
Report number |
IBISML2011-107 |
Issue date |
2012-03-05 (IBISML) |
ISSN |
Print edition: ISSN 0913-5685 Online edition: ISSN 2432-6380 |
Copyright |
The copyright of papers published in the Technical Report belongs to the Institute of Electronics, Information and Communication Engineers (IEICE). (Permission numbers: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034) |
PDF download |
IBISML2011-107 |
|