Dimension reduction using the generalized gradient direction

Junlong Zhao and Xiuli Zhao

Computational Statistics & Data Analysis, 2010, vol. 54, issue 4, 1089-1102

Abstract: Sufficient dimension reduction methods such as sliced inverse regression (SIR) and the sliced average variance estimate (SAVE) usually place restrictions on the regressor, requiring X to be elliptically distributed or normal. We propose a new, effective method for sufficient dimension reduction, called the generalized gradient direction method (GGD). Compared with SIR, SAVE, and related methods, GGD makes very weak assumptions on X and performs well whether X is continuous or numerically discrete, whereas existing methods are all developed for continuous X. The computation required by GGD is as simple as that of SIR, SAVE, and similar methods. Moreover, GGD proves robust relative to many standard techniques. Simulation results, compared with those of other methods, support the advantages of GGD.
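
The gradient idea behind such methods can be illustrated with a small numerical sketch. The code below is a minimal outer-product-of-gradients style estimator written for illustration only; it is not the authors' GGD procedure, and the function name gradient_direction_sdr, the Gaussian kernel, the bandwidth rule, and the ridge term are assumptions made for this sketch. Local linear fits estimate the gradient of E[y | X = x] at each sample point, and the leading eigenvectors of the averaged gradient outer products span the estimated dimension-reduction subspace.

import numpy as np

def gradient_direction_sdr(X, y, n_dirs=1, bandwidth=None):
    # Estimate local gradients of E[y | X = x] by kernel-weighted local linear
    # regression, then take leading eigenvectors of the averaged outer product
    # of those gradients as the dimension-reduction directions.
    n, p = X.shape
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize predictors
    if bandwidth is None:
        bandwidth = 1.2 * n ** (-1.0 / (p + 4))  # rough rule-of-thumb scale (assumption)
    M = np.zeros((p, p))
    for i in range(n):
        d = Xs - Xs[i]                           # displacements from the i-th point
        w = np.exp(-0.5 * np.sum(d ** 2, axis=1) / bandwidth ** 2)  # Gaussian kernel weights
        Z = np.hstack([np.ones((n, 1)), d])      # local linear design matrix
        ZtW = Z.T * w
        # a small ridge term keeps the local normal equations well conditioned
        coef = np.linalg.solve(ZtW @ Z + 1e-8 * np.eye(p + 1), ZtW @ y)
        g = coef[1:]                             # estimated gradient at the i-th point
        M += np.outer(g, g) / n
    _, vecs = np.linalg.eigh(M)                  # eigenvectors in ascending eigenvalue order
    return vecs[:, ::-1][:, :n_dirs]             # top n_dirs directions

# Toy usage: a single-index model y = (beta' X)^2 + noise.
rng = np.random.default_rng(0)
n, p = 300, 5
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:2] = 1.0 / np.sqrt(2.0)
y = (X @ beta) ** 2 + 0.1 * rng.normal(size=n)
b_hat = gradient_direction_sdr(X, y, n_dirs=1).ravel()
print(abs(b_hat @ beta))                         # close to 1 when the direction is recovered

A value near 1 indicates that the estimated direction aligns with the true index direction; the sketch only conveys the flavor of gradient-based dimension reduction and makes no attempt to reproduce GGD's weak assumptions on X.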

Date: 2010

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167-9473(09)00400-9
Full text for ScienceDirect subscribers only.

Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:54:y:2010:i:4:p:1089-1102

Computational Statistics & Data Analysis is currently edited by S.P. Azen

More articles in Computational Statistics & Data Analysis from Elsevier
Bibliographic data for series maintained by Catherine Liu.

Handle: RePEc:eee:csdana:v:54:y:2010:i:4:p:1089-1102