EconPapers    

LiteGaze: Neural architecture search for efficient gaze estimation

Xinwei Guo, Yong Wu, Jingjing Miao and Yang Chen

PLOS ONE, 2023, vol. 18, issue 5, 1-11

Abstract: Gaze estimation plays a critical role in human-centered vision applications such as human–computer interaction and virtual reality. Although significant progress has been made in automatic gaze estimation by deep convolutional neural networks, it is still difficult to directly deploy deep learning-based gaze estimation models across different edge devices, due to the high computational cost and varied resource constraints. This work proposes LiteGaze, a deep learning framework that learns architectures for efficient gaze estimation via neural architecture search (NAS). Inspired by the once-for-all model (Cai et al., 2020), this work decouples model training and architecture search into two different stages. In particular, a supernet is trained to support diverse architectural settings. Then specialized sub-networks are selected from the obtained supernet, given different efficiency constraints. Extensive experiments are performed on two gaze estimation datasets and demonstrate the superiority of the proposed method over previous works, advancing real-time gaze estimation on edge devices.
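The two-stage recipe the abstract describes (train a supernet once, then select specialized sub-networks per efficiency budget) can be sketched with a toy random search. The search space, latency model, and accuracy proxy below are illustrative assumptions for a minimal demonstration, not the paper's actual implementation:

```python
import random

# Hypothetical search space in the once-for-all style: the supernet is assumed
# to support every combination of these depth/width/kernel settings, so any
# sampled sub-network inherits trained weights without retraining.
DEPTHS = [2, 3, 4]
WIDTHS = [0.5, 0.75, 1.0]
KERNELS = [3, 5, 7]

def sample_subnet(rng):
    """Randomly sample one architectural setting from the supernet's space."""
    return {
        "depth": rng.choice(DEPTHS),
        "width": rng.choice(WIDTHS),
        "kernel": rng.choice(KERNELS),
    }

def latency_ms(net):
    """Toy latency model: cost grows with depth, width, and kernel size."""
    return net["depth"] * net["width"] * net["kernel"]

def proxy_accuracy(net):
    """Toy accuracy proxy: larger sub-networks score higher."""
    return net["depth"] + 2.0 * net["width"] + 0.1 * net["kernel"]

def search(budget_ms, trials=500, seed=0):
    """Return the best-scoring sub-network that fits the latency budget."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        net = sample_subnet(rng)
        if latency_ms(net) > budget_ms:
            continue  # violates the efficiency constraint for this device
        if best is None or proxy_accuracy(net) > proxy_accuracy(best):
            best = net
    return best

# Different budgets yield different specialized sub-networks, with no
# further supernet training — the key property the abstract highlights.
fast = search(budget_ms=8.0)   # tight budget, e.g. a weak edge device
slow = search(budget_ms=30.0)  # looser budget
```

In the actual once-for-all approach the accuracy proxy is a learned predictor and the search is typically evolutionary rather than random, but the constraint-filtered selection step has the same shape.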

Date: 2023
References: View complete reference list from CitEc

Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0284814 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 84814&type=printable (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0284814

DOI: 10.1371/journal.pone.0284814

Access Statistics for this article

More articles in PLOS ONE from Public Library of Science
Bibliographic data for series maintained by plosone.

 
Page updated 2025-05-03
Handle: RePEc:plo:pone00:0284814