Sparse Support Tensor Machine with Scaled Kernel Functions
Shuangyue Wang and Ziyan Luo
Additional contact information
Shuangyue Wang: School of Mathematics and Statistics, Beijing Jiaotong University, Beijing 100044, China
Ziyan Luo: School of Mathematics and Statistics, Beijing Jiaotong University, Beijing 100044, China
Mathematics, 2023, vol. 11, issue 13, 1-20
Abstract:
As one of the supervised tensor learning methods, the support tensor machine (STM) for tensorial data classification is receiving increasing attention in machine learning and related applications, including remote sensing imaging, video processing, and fault diagnosis. Existing STM approaches do not control the number of support tensors and thus offer little data reduction. To address this deficiency, we build a novel sparse STM model that controls the number of support tensors in the binary classification of tensorial data. The sparsity is imposed on the dual variables in the feature space, which facilitates nonlinear classification with kernel tricks such as the widely used Gaussian RBF kernel. To alleviate the local risk associated with the constant width of the tensor Gaussian RBF kernel, we propose a two-stage classification approach: in the second stage, the kernel function is rescaled in a data-dependent way, using the support tensors obtained from the first stage. The essential optimization models in both stages are of the same type, which is non-convex and discontinuous due to the sparsity constraint. To resolve the computational challenge, a subspace Newton method is tailored to this sparsity-constrained optimization, offering efficient computation with local convergence. Numerical experiments on real datasets demonstrate the effectiveness of the proposed two-stage sparse STM approach in terms of classification accuracy, compared with state-of-the-art binary classification approaches.
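The scaled-kernel idea in the abstract can be sketched in code. The snippet below is a minimal, hypothetical illustration, not the paper's implementation: it assumes a Gaussian RBF kernel on the Frobenius distance between tensors and one common form of data-dependent scaling (a conformal factor that is large near the stage-one support tensors). The names tensor_rbf_kernel, conformal_factor, and scaled_rbf_kernel, as well as the parameters sigma and tau, are invented for illustration.

```python
# Hypothetical sketch of a tensor Gaussian RBF kernel with a data-dependent
# conformal rescaling built from stage-one support tensors. This is NOT the
# authors' method; the scaling form and all names/parameters are assumptions.
import numpy as np

def tensor_rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel on tensors via the squared Frobenius distance."""
    d2 = np.sum((X - Y) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def conformal_factor(X, support_tensors, tau=1.0):
    """Data-dependent factor that is large near the support tensors found in
    stage one, so the rescaled kernel has finer resolution near the boundary."""
    return sum(np.exp(-np.sum((X - S) ** 2) / (2.0 * tau ** 2))
               for S in support_tensors)

def scaled_rbf_kernel(X, Y, support_tensors, sigma=1.0, tau=1.0):
    """Scaled kernel K~(X, Y) = c(X) * c(Y) * K(X, Y)."""
    cX = conformal_factor(X, support_tensors, tau)
    cY = conformal_factor(Y, support_tensors, tau)
    return cX * cY * tensor_rbf_kernel(X, Y, sigma)

# Usage: stage one yields support tensors; stage two retrains with the scaled kernel.
rng = np.random.default_rng(0)
A, B = rng.standard_normal((4, 5, 6)), rng.standard_normal((4, 5, 6))
supports = [rng.standard_normal((4, 5, 6)) for _ in range(3)]
print(scaled_rbf_kernel(A, B, supports, sigma=2.0, tau=2.0))
```

Multiplying the base kernel by c(X)c(Y) keeps it positive semidefinite while effectively making the kernel width data dependent near the support tensors, which is the usual motivation for conformal rescaling; whether this matches the scaling strategy used in the paper should be checked against the full text.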
Keywords: support tensor machine; sparsity; scaled kernel function; subspace Newton method; binary classification
JEL-codes: C
Date: 2023
Downloads:
https://www.mdpi.com/2227-7390/11/13/2829/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/13/2829/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:13:p:2829-:d:1178056