
Generalized kernel-based inverse regression methods for sufficient dimension reduction

Chuanlong Xie and Lixing Zhu

Computational Statistics & Data Analysis, 2020, vol. 150, issue C

Abstract: The linearity condition and the constant conditional variance assumption, widely used in sufficient dimension reduction, are close to elliptical symmetry and normality, respectively. However, there has long been concern about their restrictiveness. In this article, we give a systematic study of why the popular sliced inverse regression and sliced average variance estimation require these conditions. We then propose a new framework that relaxes these conditions and suggest generalized kernel-based inverse regression methods to handle a class of mixture multivariate unified skew-elliptical distributions.
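
For orientation, the sketch below shows classical sliced inverse regression (SIR), the estimator whose linearity condition the article examines. It is a minimal Python/NumPy illustration of the inverse-regression idea only, not the generalized kernel-based method proposed in the paper; the function name, slicing scheme, and toy model are illustrative assumptions.

# Minimal sketch of classical sliced inverse regression (SIR).
# This is the standard reference algorithm, not the authors' generalized
# kernel-based estimator.
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Estimate sufficient dimension reduction directions via SIR.

    X : (n, p) array of predictors
    y : (n,) array of responses
    Returns a (p, n_directions) array of estimated directions.
    """
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = X_centered @ inv_sqrt

    # Slice the response into roughly equal-sized groups
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the slice means of Z (the inverse regression step)
    M = np.zeros((p, p))
    for idx in slices:
        w = len(idx) / n
        m = Z[idx].mean(axis=0)
        M += w * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original predictor scale
    _, vecs = np.linalg.eigh(M)
    directions = inv_sqrt @ vecs[:, ::-1][:, :n_directions]
    return directions / np.linalg.norm(directions, axis=0)

# Toy usage: a single-index model y = (x' b)^3 + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
b = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ b) ** 3 + 0.1 * rng.normal(size=500)
print(sir_directions(X, y, n_slices=10, n_directions=1).ravel())

On this toy model, the printed vector should be close (up to sign) to the true direction b; recovering it from the eigen-decomposition of the sliced means is exactly what relies on the linearity condition discussed in the abstract.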

Keywords: Unified skew-elliptical distribution; Stein's Lemma; Sufficient dimension reduction
Date: 2020

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167947320300864
Full text for ScienceDirect subscribers only.

Persistent link: https://EconPapers.repec.org/RePEc:eee:csdana:v:150:y:2020:i:c:s0167947320300864

DOI: 10.1016/j.csda.2020.106995

Computational Statistics & Data Analysis is currently edited by S.P. Azen

More articles in Computational Statistics & Data Analysis from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Handle: RePEc:eee:csdana:v:150:y:2020:i:c:s0167947320300864