End-to-end feature-aware label space encoding for multilabel classification with many classes
Lin, Zijia, Ding, Guiguang, Han, Jungong and Shao, Ling (2018) End-to-end feature-aware label space encoding for multilabel classification with many classes. IEEE Transactions on Neural Networks and Learning Systems, 29 (6). pp. 2472-2487. doi:10.1109/TNNLS.2017.2691545 ISSN 2162-237X.
Research output not available from this repository.
Official URL: http://dx.doi.org/10.1109/TNNLS.2017.2691545
Abstract
To make the problem of multilabel classification with many classes more tractable, recent years have seen efforts devoted to label space dimension reduction (LSDR). Specifically, LSDR encodes high-dimensional label vectors into low-dimensional code vectors lying in a latent space, so as to train predictive models at much lower cost. At prediction time, it classifies any unseen instance by recovering a label vector from its predicted code vector via a decoding process. In this paper, we propose a novel method, namely End-to-End Feature-aware label space Encoding (E²FE), to perform LSDR. Instead of requiring an encoding function like most previous works, E²FE directly learns a code matrix formed by the code vectors of the training instances in an end-to-end manner. Another distinct property of E²FE is its feature awareness, attributable to the fact that the code matrix is learned by jointly maximizing the recoverability of the label space and the predictability of the latent space. Based on the learned code matrix, E²FE further trains predictive models to map instance features into code vectors, and also learns a linear decoding matrix for efficiently recovering the label vector of any unseen instance from its predicted code vector. Theoretical analyses show that both the code matrix and the linear decoding matrix in E²FE can be efficiently learned. Moreover, similar to previous works, E²FE can be specified to learn an encoding function, and it can also be extended with kernel tricks to handle nonlinear correlations between the feature space and the latent space. Comprehensive experiments conducted on diverse benchmark data sets with many classes show consistent performance gains of E²FE over state-of-the-art methods.
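The generic LSDR pipeline the abstract describes — encode label vectors into low-dimensional codes, train a feature-to-code predictor, then decode predicted codes back to labels — can be sketched as follows. This is a minimal illustration only: it uses a truncated-SVD encoding (in the style of PLST) as a stand-in, since E²FE's end-to-end, feature-aware learning of the code matrix is the paper's contribution and is not reproduced here. All data sizes, the ridge regressor, and the 0.5 decoding threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multilabel data: n instances, d features, L labels (sizes are hypothetical).
n, d, L, k = 200, 10, 40, 5          # k = latent code dimension, k << L
X = rng.normal(size=(n, d))
W_true = rng.normal(size=(d, L))
Y = (X @ W_true + rng.normal(scale=0.5, size=(n, L)) > 1.0).astype(float)

# Encoding: compress label vectors into k-dimensional code vectors.
# A truncated SVD of Y is used here as a simple stand-in; E2FE instead learns
# the code matrix end-to-end, jointly with its predictability from features.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
Z = U[:, :k] * s[:k]                 # code matrix: one k-dim code per instance

# Predictive model: ridge regression mapping features to code vectors.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Z)

# Linear decoding matrix: least-squares map from codes back to label vectors.
D = np.linalg.lstsq(Z, Y, rcond=None)[0]   # shape (k, L)

# Prediction for unseen instances: predict the code, then decode and threshold.
X_new = rng.normal(size=(5, d))
Y_pred = ((X_new @ W) @ D > 0.5).astype(int)
print(Y_pred.shape)                  # (5, 40)
```

Training the predictor in the k-dimensional latent space rather than the L-dimensional label space is what makes the approach cheaper when L is large; the quality of the decoding matrix then determines how much label information survives the round trip.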
| Field | Value |
| --- | --- |
| Item Type | Journal Article |
| Divisions | Faculty of Science, Engineering and Medicine > Engineering > WMG (Formerly the Warwick Manufacturing Group) |
| Journal or Publication Title | IEEE Transactions on Neural Networks and Learning Systems |
| Publisher | IEEE |
| ISSN | 2162-237X |
| Official Date | June 2018 |
| Volume | 29 |
| Number | 6 |
| Page Range | pp. 2472-2487 |
| DOI | 10.1109/TNNLS.2017.2691545 |
| Status | Peer Reviewed |
| Publication Status | Published |
| Access rights to Published version | Restricted or Subscription Access |