Generalized Penalized Constrained Regression: Sharp Guarantees in High Dimensions with Noisy Features

Ayed M. Alrashdi, Meshari Alazmi and Masad A. Alrasheedi
Additional contact information
Ayed M. Alrashdi: Department of Electrical Engineering, College of Engineering, University of Ha’il, Ha’il 81441, Saudi Arabia
Meshari Alazmi: Department of Information and Computer Science, College of Computer Science and Engineering, University of Ha’il, Ha’il 81411, Saudi Arabia
Masad A. Alrasheedi: Department of Management Information Systems, College of Business Administration, Taibah University, Madinah 42353, Saudi Arabia

Mathematics, 2023, vol. 11, issue 17, 1-27

Abstract: The generalized penalized constrained regression (G-PCR) is a penalized model for high-dimensional linear inverse problems with structured features. This paper presents a sharp error performance analysis of the G-PCR in the over-parameterized high-dimensional setting. The analysis is carried out under the assumption of a noisy or erroneous Gaussian features matrix. To assess the performance of the G-PCR problem, the study employs multiple metrics such as prediction risk, cosine similarity, and the probabilities of misdetection and false alarm. These metrics offer valuable insights into the accuracy and reliability of the G-PCR model under different circumstances. Furthermore, the derived results are specialized and applied to well-known instances of G-PCR, including ℓ1-norm penalized regression for sparse signal recovery and ℓ2-norm (ridge) penalization. These specific instances are widely utilized in regression analysis for purposes such as feature selection and model regularization. To validate the obtained results, the paper provides numerical simulations conducted on both real-world and synthetic datasets. Extensive simulations further show that the results are universal and robust with respect to the assumed Gaussian distribution of the features matrix. We empirically investigate the so-called double descent phenomenon and show how optimal selection of the hyper-parameters of the G-PCR can help mitigate this phenomenon. The derived expressions and insights from this study can be utilized to optimally select the hyper-parameters of the G-PCR. By leveraging these findings, one can make well-informed decisions regarding the configuration and fine-tuning of the G-PCR model, taking into consideration the specific problem at hand as well as the presence of noisy features in the high-dimensional setting.
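
The two penalized instances highlighted in the abstract can be illustrated with a short numerical sketch. The snippet below is an assumption-laden illustration, not the authors' G-PCR solver: it uses standard scikit-learn Ridge and Lasso estimators as stand-ins for the ℓ2- and ℓ1-penalized instances, an arbitrary over-parameterized problem size with an erroneous Gaussian features matrix, and reports the prediction risk and cosine similarity metrics discussed above. All dimensions, noise levels, and penalty weights are illustrative choices.

# Illustrative sketch only: l2 (ridge) and l1 (lasso) penalized regression with a
# noisy Gaussian features matrix, evaluated via prediction risk and cosine similarity.
# This is NOT the authors' G-PCR formulation; all sizes and penalty weights are assumed.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)

n, p, sparsity = 100, 300, 0.1                      # over-parameterized: p > n (assumed sizes)
beta = np.zeros(p)
support = rng.choice(p, int(sparsity * p), replace=False)
beta[support] = rng.standard_normal(support.size)   # sparse ground-truth signal

A_true = rng.standard_normal((n, p)) / np.sqrt(n)   # Gaussian features matrix
E = 0.3 * rng.standard_normal((n, p)) / np.sqrt(n)  # feature errors (noise in the matrix)
A_noisy = A_true + E                                 # only the noisy matrix is observed
y = A_true @ beta + 0.1 * rng.standard_normal(n)     # measurements with additive noise

def evaluate(estimator, name):
    """Fit on the noisy features, then report prediction risk and cosine similarity."""
    estimator.fit(A_noisy, y)
    b_hat = estimator.coef_
    risk = np.mean((A_true @ (b_hat - beta)) ** 2)   # prediction risk w.r.t. true features
    cos = (b_hat @ beta) / (np.linalg.norm(b_hat) * np.linalg.norm(beta) + 1e-12)
    print(f"{name}: prediction risk = {risk:.4f}, cosine similarity = {cos:.3f}")

evaluate(Ridge(alpha=1.0, fit_intercept=False), "l2 (ridge) penalty")
evaluate(Lasso(alpha=0.05, fit_intercept=False, max_iter=10000), "l1 (lasso) penalty")

Sweeping the penalty weight (alpha) in such a sketch is one simple way to visualize how hyper-parameter selection shapes the risk curve in the over-parameterized regime, in the spirit of the paper's double descent discussion.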

Keywords: penalized regression; prediction risk; cosine similarity; probability of false alarm; double descent; over-parameterization; constrained ridge regression
JEL-codes: C
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/17/3706/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/17/3706/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:17:p:3706-:d:1227466

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Page updated 2025-03-19
Handle: RePEc:gam:jmathe:v:11:y:2023:i:17:p:3706-:d:1227466