EconPapers    

Generalized Feature Embedding for Supervised, Unsupervised, and Online Learning Tasks

Eric Golinko and Xingquan Zhu
Additional contact information
Eric Golinko: Florida Atlantic University
Xingquan Zhu: Florida Atlantic University

Information Systems Frontiers, 2019, vol. 21, issue 1, No 8, 125-142

Abstract: Feature embedding is an emerging research area that aims to transform features from the original space into a new space to support effective learning. Many feature embedding algorithms exist, but they often suffer from several major drawbacks: (1) they handle only a single feature type, or users must explicitly separate features into different feature views and supply this information for embedding learning; (2) they are designed for either supervised or unsupervised learning tasks, but not both; and (3) embeddings for new out-of-training samples must be obtained through a retraining phase, making them unsuitable for online learning tasks. In this paper, we propose a generalized feature embedding algorithm, GEL, for supervised, unsupervised, and online learning tasks. GEL learns feature embeddings from any type of data, including data with mixed feature types. For supervised learning tasks with class label information, GEL leverages a Class Partitioned Instance Representation (CPIR) process to arrange instances, based on their labels, into a dense binary representation via row and feature vectors for embedding learning. If class labels are unavailable, CPIR naturally degenerates and treats all instances as one class. Based on the CPIR representation, GEL uses eigenvector decomposition to convert the proximity matrix into a low-dimensional space. For new out-of-training samples, their low-dimensional representations are derived through a direct conversion without a retraining phase. The learned numerical embedding features can be used directly to represent instances for effective learning. Experiments and comparisons on 28 datasets, including categorical, numerical, and ordinal features, demonstrate that embedding features learned by GEL can effectively represent the original instances for clustering, classification, and online learning.
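The abstract's pipeline (proximity matrix, eigenvector decomposition, direct conversion of out-of-training samples) can be illustrated with a minimal numpy sketch. This is a hypothetical illustration only: the CPIR construction is not detailed in this record, so a plain binary instance matrix and its Gram matrix stand in for it, and the out-of-sample step uses a standard Nystrom-style projection rather than the paper's exact formulation.

```python
# Hypothetical sketch: eigendecomposition of a proximity matrix plus
# out-of-sample projection without retraining. The binary matrix B is
# a stand-in for the paper's CPIR representation (not specified here).
import numpy as np

def fit_embedding(B, d):
    """B: n x m binary instance matrix; returns the d-dimensional
    training embedding and the pieces needed to project new samples."""
    K = B @ B.T                       # proximity (Gram) matrix, PSD
    vals, vecs = np.linalg.eigh(K)    # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:d]  # keep the d largest eigenvalues
    lam, V = vals[idx], vecs[:, idx]
    E = V * np.sqrt(lam)              # low-dimensional training embedding
    return E, V, lam

def embed_new(B_train, V, lam, x):
    """Direct conversion of an out-of-training sample: compute its
    proximity to the training rows, then project onto the eigenbasis."""
    k = B_train @ x                   # proximity vector to training set
    return (k @ V) / np.sqrt(lam)     # assumes the kept lam are > 0
```

For a training instance, this projection reproduces its training embedding exactly (up to the discarded eigen-directions), which is the sense in which new samples need no retraining.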

Keywords: Representation learning; Feature embedding; Dimension reduction; Supervised learning; Clustering; Online learning (search for similar items in EconPapers)
Date: 2019
Citations: View citations in EconPapers (1)

Downloads: (external link)
http://link.springer.com/10.1007/s10796-018-9850-y Abstract (text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:spr:infosf:v:21:y:2019:i:1:d:10.1007_s10796-018-9850-y

Ordering information: This journal article can be ordered from
http://www.springer.com/journal/10796

DOI: 10.1007/s10796-018-9850-y


Information Systems Frontiers is currently edited by Ram Ramesh and Raghav Rao

More articles in Information Systems Frontiers from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-03-20
Handle: RePEc:spr:infosf:v:21:y:2019:i:1:d:10.1007_s10796-018-9850-y