A Survey for Sparse Regularization Based Compression Methods

Anda Tang, Pei Quan, Lingfeng Niu and Yong Shi
Additional contact information
Anda Tang: University of Chinese Academy of Sciences
Pei Quan: University of Chinese Academy of Sciences
Lingfeng Niu: University of Chinese Academy of Sciences
Yong Shi: Chinese Academy of Sciences

Annals of Data Science, 2022, vol. 9, issue 4, No 3, 695-722

Abstract: In recent years, deep neural networks (DNNs) have attracted extensive attention due to their excellent performance in fields such as computer vision and speech recognition. As the tasks to be solved grow in scale, the networks used become wider and deeper, requiring millions or even billions of parameters. Such deep and wide networks bring problems of memory requirements, computational overhead and overfitting, which seriously hinder the practical application of DNNs. A natural idea, therefore, is to train sparse networks that need fewer parameters and floating-point operations while maintaining comparable performance. Over the past few years, a great deal of research has been devoted to neural network compression, including sparsity-inducing methods, quantization, knowledge distillation and so on. Sparsity-inducing methods can be roughly divided into pruning, dropout and sparse regularization based optimization. In this paper, we briefly review and analyze sparse regularization based optimization methods. For the models and optimization methods of sparse regularization based compression, we discuss their respective advantages and disadvantages. Finally, we provide some insights and discussion on how sparse regularization can be made to fit within the compression framework.
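The core idea the abstract describes — sparse regularization driving many weights exactly to zero — can be sketched with a toy example. This is not code from the paper; the least-squares problem, parameter names, and hyperparameters below are illustrative choices. It minimizes a smooth loss plus an l1 penalty with proximal gradient descent (ISTA), whose soft-thresholding step is what produces exact zeros.

```python
import numpy as np

def soft_threshold(w, t):
    # Proximal operator of the l1 norm: shrinks each weight toward zero
    # and sets small ones exactly to zero -- the source of sparsity.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def ista(X, y, lam=0.5, lr=0.005, steps=500):
    # Proximal gradient descent on 0.5*||Xw - y||^2 + lam*||w||_1.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y)                  # gradient of the smooth part
        w = soft_threshold(w - lr * grad, lr * lam)  # prox step for the l1 part
    return w

# Toy problem: only 3 of 10 weights are truly informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true

w_hat = ista(X, y)
print((np.abs(w_hat) > 1e-8).sum())  # count of nonzero weights, near the true 3
```

The same mechanism underlies sparse regularization based compression of DNNs: the l1 (or a group-sparse) penalty is added to the training loss so that individual weights, filters, or neurons are driven exactly to zero and can be removed from the network.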

Keywords: Deep neural networks; Sparsity learning; Compression (search for similar items in EconPapers)
Date: 2022

Downloads: (external link)
http://link.springer.com/10.1007/s40745-022-00389-6 Abstract (text/html)
Access to the full text of the articles in this series is restricted.



Persistent link: https://EconPapers.repec.org/RePEc:spr:aodasc:v:9:y:2022:i:4:d:10.1007_s40745-022-00389-6

Ordering information: This journal article can be ordered from
https://www.springer ... gement/journal/40745

DOI: 10.1007/s40745-022-00389-6


Annals of Data Science is currently edited by Yong Shi

More articles in Annals of Data Science from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Handle: RePEc:spr:aodasc:v:9:y:2022:i:4:d:10.1007_s40745-022-00389-6