Multiple Kernel Spectral Regression for Dimensionality Reduction
Bing Liu, Shixiong Xia and Yong Zhou
Journal of Applied Mathematics, 2013, vol. 2013, issue 1
Abstract:
Traditional manifold learning algorithms, such as locally linear embedding, Isomap, and Laplacian eigenmap, provide embeddings only for the training samples. To address this out‐of‐sample extension problem, spectral regression (SR) learns an embedding function within a regression framework, which avoids the eigen‐decomposition of dense matrices. Motivated by the effectiveness of SR, we incorporate multiple kernel learning (MKL) into SR for dimensionality reduction. The proposed approach (termed MKL‐SR) seeks an embedding function in the Reproducing Kernel Hilbert Space (RKHS) induced by multiple base kernels, further improving the performance of kernel‐based SR (KSR). Furthermore, MKL‐SR can be applied in supervised, unsupervised, and semi‐supervised settings. Experimental results on supervised and semi‐supervised classification demonstrate the effectiveness and efficiency of our algorithm.
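To make the abstract's idea concrete, here is a minimal sketch of supervised kernel spectral regression with a convex combination of base kernels. It assumes a fixed kernel weighting `mu` (the paper learns these weights jointly with the embedding) and uses orthonormalized class indicators as the graph responses; the function names (`rbf_kernel`, `mkl_sr_fit`) and parameters are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def mkl_sr_fit(X, labels, gammas, mu, alpha=0.01):
    """Supervised spectral regression with a fixed convex kernel combination.

    1) Build response vectors from class labels (supervised SR targets).
    2) Combine base RBF kernels with weights mu.
    3) Solve the regularized system (K + alpha*I) A = Y for coefficients A,
       avoiding eigen-decomposition of a dense matrix.
    """
    n = X.shape[0]
    classes = np.unique(labels)
    # Orthonormalized class-indicator responses.
    Y = np.stack([(labels == c).astype(float) for c in classes], axis=1)
    Y /= np.sqrt(Y.sum(axis=0, keepdims=True))
    # Combined kernel: convex combination of the base kernels.
    K = sum(m * rbf_kernel(X, g) for m, g in zip(mu, gammas))
    A = np.linalg.solve(K + alpha * np.eye(n), Y)
    return A, K

# Toy data: two Gaussian clusters in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(3, 1, (20, 5))])
labels = np.array([0] * 20 + [1] * 20)
A, K = mkl_sr_fit(X, labels, gammas=[0.1, 1.0], mu=[0.5, 0.5])
Z = K @ A  # low-dimensional embedding of the training samples
print(Z.shape)  # (40, 2)
```

Because the embedding function lives in the RKHS of the combined kernel, a new sample is embedded by evaluating its kernel values against the training set and multiplying by `A`, which is how the regression formulation resolves the out-of-sample problem.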
Date: 2013
https://doi.org/10.1155/2013/427462
Persistent link: https://EconPapers.repec.org/RePEc:wly:jnljam:v:2013:y:2013:i:1:n:427462
More articles in Journal of Applied Mathematics from John Wiley & Sons