Fast face recognition based on fractal theory

Zhijie Tang, Xiaocheng Wu, Bin Fu, Weiwei Chen and Hao Feng

Applied Mathematics and Computation, 2018, vol. 321, issue C, 721-730

Abstract: Nowadays, accuracy, speed and convenience are increasingly important in personal identification. A variety of methods have been proposed in biometrics and computer vision, yet a fully satisfactory approach to face recognition remains a challenge. Although some reliable systems and advanced methods have been introduced under relatively controlled conditions, their recognition rate or speed is unsatisfactory in general settings, especially under variations in pose, illumination and facial expression. This paper proposes a fast face recognition method based on fractal theory: facial images are compressed to obtain fractal codes, and recognition is carried out with these codes. Experimental results on the Yale, FERET and CMU PIE databases demonstrate the high efficiency of the method in both runtime and recognition accuracy.
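The abstract only states that faces are compressed into fractal codes and then matched; the listing gives no implementation details. The snippet below is therefore a minimal sketch of the general idea, assuming a standard partitioned-iterated-function-system (PIFS) style encoder in which each range block is matched to a down-sampled domain block under an affine grey-level map r ≈ s·d + o. The function names, block sizes, and the plain Euclidean distance between code vectors are illustrative assumptions, not the authors' actual method.

import numpy as np

def encode_fractal(img, range_size=4):
    """Return an array of (domain_row, domain_col, scale, offset), one row per range block."""
    h, w = img.shape
    dsize = 2 * range_size                      # domain blocks are twice the range-block size
    # Pre-extract every non-overlapping domain block and down-sample it to range size.
    domains = []
    for i in range(0, h - dsize + 1, dsize):
        for j in range(0, w - dsize + 1, dsize):
            d = img[i:i + dsize, j:j + dsize]
            d = d.reshape(range_size, 2, range_size, 2).mean(axis=(1, 3))
            domains.append((i, j, d.ravel()))
    code = []
    for i in range(0, h - range_size + 1, range_size):
        for j in range(0, w - range_size + 1, range_size):
            r = img[i:i + range_size, j:j + range_size].ravel()
            best = None
            for (di, dj, d) in domains:
                # Least-squares contrast (s) and brightness (o) for r ≈ s*d + o.
                var = d.var()
                s = 0.0 if var == 0 else np.cov(d, r, bias=True)[0, 1] / var
                o = r.mean() - s * d.mean()
                err = np.sum((s * d + o - r) ** 2)
                if best is None or err < best[0]:
                    best = (err, di, dj, s, o)
            code.append(best[1:])
    return np.array(code, dtype=float)

def code_distance(code_a, code_b):
    # Illustrative matching rule (an assumption, not the paper's): Euclidean distance
    # between code vectors of two same-sized images encoded with the same partition.
    return np.linalg.norm(code_a - code_b)

In a recognition setting one would encode each gallery face once, encode the probe image, and assign it the identity of the gallery face whose fractal code is nearest under some such distance; the attraction of the approach is that matching operates on the compact codes rather than on raw pixels.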

Keywords: Face recognition; Fractal theory; Fractal code
Date: 2018
References: View complete reference list from CitEc
Citations: View citations in EconPapers (1)

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0096300317307993
Full text for ScienceDirect subscribers only



Persistent link: https://EconPapers.repec.org/RePEc:eee:apmaco:v:321:y:2018:i:c:p:721-730

DOI: 10.1016/j.amc.2017.11.017


Applied Mathematics and Computation is currently edited by Theodore Simos

More articles in Applied Mathematics and Computation from Elsevier
Bibliographic data for series maintained by Catherine Liu.

Handle: RePEc:eee:apmaco:v:321:y:2018:i:c:p:721-730