An Efficient Sparse Twin Parametric Insensitive Support Vector Regression Model

Shuanghong Qu (), Yushan Guo, Renato De Leone (), Min Huang and Pu Li
Additional contact information
Shuanghong Qu: College of Mathematics and Information Science, Zhengzhou University of Light Industry, Zhengzhou 450002, China
Yushan Guo: School of Science and Technology, University of Camerino, 62032 Camerino, Italy
Renato De Leone: School of Science and Technology, University of Camerino, 62032 Camerino, Italy
Min Huang: College of Software Engineering, Zhengzhou University of Light Industry, Zhengzhou 450001, China
Pu Li: College of Software Engineering, Zhengzhou University of Light Industry, Zhengzhou 450001, China

Mathematics, 2025, vol. 13, issue 13, 1-29

Abstract: This paper proposes a novel sparse twin parametric insensitive support vector regression (STPISVR) model designed to enhance sparsity and improve generalization performance. Like twin parametric insensitive support vector regression (TPISVR), STPISVR constructs a pair of nonparallel parametric insensitive bound functions that indirectly determine the regression function. The optimization problems are reformulated as two sparse linear programming problems (LPPs) rather than the traditional quadratic programming problems (QPPs). The two LPPs originally derive from L1-norm regularization terms imposed on their respective dual variables; these terms reduce to constants via the Karush–Kuhn–Tucker (KKT) conditions and consequently vanish. This simplification reduces model complexity, while the constraints constructed through the KKT conditions, particularly their geometric properties, effectively ensure sparsity. Moreover, a two-stage hybrid tuning strategy, which combines grid search for coarse exploration of the parameter space with Bayesian optimization for fine-grained convergence, is proposed to select the optimal parameters precisely, reducing tuning time and improving accuracy compared with a single-method strategy. Experimental results on synthetic and benchmark datasets demonstrate that STPISVR significantly reduces the number of support vectors (SVs), thereby improving prediction speed and achieving a favorable trade-off among prediction accuracy, sparsity, and computational efficiency. Overall, STPISVR enhances generalization ability, promotes sparsity, and improves prediction efficiency, making it a competitive tool for regression tasks, especially on complex data structures.
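The two-stage hybrid tuning strategy described in the abstract (coarse grid search followed by a fine-grained refinement) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the model being tuned here is a simple closed-form ridge regression standing in for STPISVR, and the second stage uses a local multiplicative random search around the best grid point as a plain stand-in for the Bayesian-optimization stage (which in practice would use a library such as scikit-optimize).

```python
import itertools
import random
import numpy as np

def ridge_val_error(lam, Xtr, ytr, Xva, yva):
    """Closed-form ridge fit on the training split, MSE on the validation split."""
    d = Xtr.shape[1]
    w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)
    return float(np.mean((Xva @ w - yva) ** 2))

def two_stage_tune(objective, coarse_grid, n_refine=30, seed=0):
    """Stage 1: coarse grid search over `coarse_grid`.
    Stage 2: multiplicative local random search around the best grid point
    (a simple stand-in for the paper's Bayesian-optimization refinement)."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for lam in coarse_grid:                      # stage 1: coarse exploration
        err = objective(lam)
        if err < best_err:
            best, best_err = lam, err
    for _ in range(n_refine):                    # stage 2: local refinement
        cand = best * 2.0 ** rng.uniform(-1.0, 1.0)
        err = objective(cand)
        if err < best_err:
            best, best_err = cand, err
    return best, best_err

# Toy data: y = 2x + noise, split into train/validation halves
gen = np.random.default_rng(0)
X = gen.normal(size=(200, 1))
y = 2.0 * X[:, 0] + 0.1 * gen.normal(size=200)
Xtr, Xva, ytr, yva = X[:100], X[100:], y[:100], y[100:]

obj = lambda lam: ridge_val_error(lam, Xtr, ytr, Xva, yva)
lam_best, err_best = two_stage_tune(obj, coarse_grid=[10.0 ** k for k in range(-4, 3)])
print(lam_best, err_best)
```

The design point this sketch captures is the division of labor: the grid stage is cheap and guarantees broad coverage of the search space, while the refinement stage spends its evaluation budget only in the neighborhood the grid identified, so the refined error can never be worse than the best grid error.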

Keywords: sparse twin parametric insensitive support vector regression (STPISVR); sparsity; prediction speed; generalization performance; linear programming problem (LPP); Karush–Kuhn–Tucker (KKT) conditions
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/13/2206/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/13/2206/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:13:p:2206-:d:1695932

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

Page updated 2025-07-08
Handle: RePEc:gam:jmathe:v:13:y:2025:i:13:p:2206-:d:1695932