Learning Neural Representations and Local Embedding for Nonlinear Dimensionality Reduction Mapping
Sheng-Shiung Wu, Sing-Jie Jong, Kai Hu and Jiann-Ming Wu
Additional contact information
Sheng-Shiung Wu: Department of Applied Mathematics, National Dong Hwa University, Hualien 947301, Taiwan
Sing-Jie Jong: Department of Applied Mathematics, National Dong Hwa University, Hualien 947301, Taiwan
Kai Hu: Department of Applied Mathematics, National Dong Hwa University, Hualien 947301, Taiwan
Jiann-Ming Wu: Department of Applied Mathematics, National Dong Hwa University, Hualien 947301, Taiwan
Mathematics, 2021, vol. 9, issue 9, 1-18
Abstract:
This work explores neural approximation for nonlinear dimensionality reduction mapping based on internal representations of graph-organized regular data supports. The given training observations are assumed to be sampled from a high-dimensional space that contains an embedded low-dimensional manifold. An approximating function with adaptable built-in parameters is optimized with respect to the given training observations by the proposed learning process and is then verified by transforming novel testing observations to images in the low-dimensional output space. The optimized internal representations sketch graph-organized supports of distributed data clusters and their representative images in the output space. On this basis, the approximating function can operate at testing time without retaining the original massive set of training observations. The neural approximating model contains multiple modules; each activates a non-zero output for mapping in response to an input inside its corresponding local support. Graph-organized data supports carry lateral interconnections that represent neighboring relations, allow inference of the minimal path between the centroids of any two data supports, and supply distance constraints for mapping all centroids to images in the output space. Following the distance-preserving principle, this work proposes Levenberg-Marquardt learning for optimizing the images of the centroids in the output space subject to the given distance constraints, and further develops local embedding constraints for mapping during the execution phase. Numerical simulations show that the proposed neural approximation is effective and reliable for nonlinear dimensionality reduction mapping.
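The abstract describes a two-stage pipeline: minimal-path (geodesic) distances between the centroids of graph-organized data supports define distance constraints, the centroid images in the output space are optimized by Levenberg-Marquardt under those constraints, and novel observations are mapped during the execution phase through local embedding relative to nearby centroids. The sketch below illustrates that pipeline under simplifying assumptions; the function names (fit_centroid_images, local_embedding_map), the k-nearest-centroid weighting, and the SciPy-based solver are illustrative choices, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the distance-preserving stage and a
# simple execution-phase mapping, assuming centroids of graph-organized data
# supports and their lateral adjacency are already available.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.optimize import least_squares


def fit_centroid_images(centroids, adjacency, out_dim=2, seed=0):
    """Map centroids (K x D) to images (K x out_dim) that preserve graph geodesic distances.

    Assumes the support graph is connected and K*(K-1)/2 >= K*out_dim,
    as required by the Levenberg-Marquardt solver below.
    """
    K = centroids.shape[0]
    # Edge weights = Euclidean distances between laterally connected centroids;
    # np.inf marks non-edges for the dense shortest-path routine.
    W = np.full((K, K), np.inf)
    for i in range(K):
        for j in range(K):
            if adjacency[i, j]:
                W[i, j] = np.linalg.norm(centroids[i] - centroids[j])
    np.fill_diagonal(W, 0.0)
    geo = shortest_path(W, method="D")  # minimal-path distance constraints

    iu = np.triu_indices(K, k=1)

    def residuals(y_flat):
        # Distance-preserving residuals: output distances minus geodesic targets.
        Y = y_flat.reshape(K, out_dim)
        d_out = np.linalg.norm(Y[iu[0]] - Y[iu[1]], axis=1)
        return d_out - geo[iu]

    rng = np.random.default_rng(seed)
    y0 = rng.normal(scale=1e-2, size=K * out_dim)
    sol = least_squares(residuals, y0, method="lm")  # Levenberg-Marquardt
    return sol.x.reshape(K, out_dim)


def local_embedding_map(x, centroids, images, k=3):
    """Execution phase: map a novel observation by a locally weighted combination
    of the images of its k nearest centroids (one simple realization of a local
    embedding constraint; the paper derives its own constraints)."""
    d = np.linalg.norm(centroids - x, axis=1)
    nn = np.argsort(d)[:k]
    w = 1.0 / (d[nn] + 1e-12)
    w /= w.sum()
    return w @ images[nn]
```

Here least_squares(..., method="lm") stands in for the paper's Levenberg-Marquardt learning of centroid images, and the execution-phase mapping shown is only one plausible way to apply local embedding constraints without retaining the original training observations.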
Keywords: unsupervised learning; distance preserving mapping; nonlinear dimensionality reduction mapping; data visualization; topology preservation; data support approximation; nonlinear system solving; Levenberg-Marquardt learning; clustering analysis; principal component analysis; locally nonlinear embedding
JEL-codes: C
Date: 2021
Downloads:
https://www.mdpi.com/2227-7390/9/9/1017/pdf (application/pdf)
https://www.mdpi.com/2227-7390/9/9/1017/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:9:y:2021:i:9:p:1017-:d:546884