First-Order Sparse TSK Nonstationary Fuzzy Neural Network Based on the Mean Shift Algorithm and the Group Lasso Regularization
Bingjie Zhang,
Jian Wang,
Xiaoling Gong,
Zhanglei Shi,
Chao Zhang,
Kai Zhang,
El-Sayed M. El-Alfy and
Sergey V. Ablameyko
Additional contact information
Bingjie Zhang: School of Mathematical Sciences, Dalian University of Technology, Dalian 116024, China
Jian Wang: College of Science, China University of Petroleum (East China), Qingdao 266580, China
Xiaoling Gong: College of Control Science and Engineering, China University of Petroleum (East China), Qingdao 266580, China
Zhanglei Shi: College of Science, China University of Petroleum (East China), Qingdao 266580, China
Chao Zhang: School of Mathematical Sciences, Dalian University of Technology, Dalian 116024, China
Kai Zhang: School of Petroleum Engineering, China University of Petroleum (East China), Qingdao 266580, China
El-Sayed M. El-Alfy: Fellow SDAIA-KFUPM Joint Research Center for Artificial Intelligence, Interdisciplinary Research Center of Intelligent Secure Systems, Information and Computer Science Department, King Fahd University of Petroleum and Minerals, Dhahran 31261, Saudi Arabia
Sergey V. Ablameyko: Faculty of Applied Mathematics and Computer Science, Belarusian State University, 220030 Minsk, Belarus
Mathematics, 2023, vol. 12, issue 1, 1-14
Abstract:
Nonstationary fuzzy inference systems (NFIS) are able to tackle uncertainties and avoid the difficulty of the type-reduction operation. Combining an NFIS with a neural network, a first-order sparse TSK nonstationary fuzzy neural network (SNFNN-1) is proposed in this paper to improve the interpretability/translatability of neural networks and the self-learning ability of fuzzy rules/sets. The whole architecture of SNFNN-1 can be considered as an integrated model of multiple sub-networks with a variation in center, width, or noise. Thus, it is able to model both “intraexpert” and “interexpert” variability. Two techniques are adopted in this network: the Mean Shift-based fuzzy partition and the Group Lasso-based rule selection, which adaptively generate a suitable number of clusters and select important fuzzy rules, respectively. Quantitative experiments on six UCI datasets demonstrate the effectiveness and robustness of the proposed model.
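As a rough illustration of the two techniques named in the abstract, the minimal Python sketch below (not the authors' implementation) uses scikit-learn's MeanShift to choose Gaussian membership centers, forms first-order TSK rule outputs, and scores each rule's consequent parameter group with a Group Lasso penalty. The bandwidth, membership width sigma, penalty weight lam, and the toy data are illustrative assumptions; the nonstationary perturbations of center, width, and noise described in the paper are omitted.

# Minimal sketch (assumed setup, not the paper's code): Mean Shift picks the
# membership centers, a first-order TSK layer combines the rules, and a Group
# Lasso penalty measures each rule's consequent group for possible pruning.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                  # toy inputs (n_samples, n_features)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

# 1) Mean Shift infers the number of clusters (rules) from the data itself.
centers = MeanShift(bandwidth=1.5).fit(X).cluster_centers_      # (n_rules, n_features)
n_rules, n_feat = centers.shape
sigma = 1.0                                                     # assumed common membership width

# 2) Gaussian memberships and normalized rule firing strengths.
dist2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)    # (n_samples, n_rules)
firing = np.exp(-dist2 / (2 * sigma ** 2))
firing /= firing.sum(axis=1, keepdims=True)

# 3) First-order (linear) consequents: one weight group per rule, plus a bias.
W = rng.normal(scale=0.1, size=(n_rules, n_feat + 1))           # rule-wise parameter groups
X1 = np.hstack([X, np.ones((len(X), 1))])
y_hat = np.einsum('nr,rf,nf->n', firing, W, X1)                 # TSK network output

# 4) Group Lasso penalty: sum of Euclidean norms of the rule-wise groups.
lam = 0.01
group_lasso = lam * np.linalg.norm(W, axis=1).sum()
loss = 0.5 * np.mean((y - y_hat) ** 2) + group_lasso
print(f"rules found: {n_rules}, loss: {loss:.4f}")

In a trained network, rules whose group norm is driven toward zero by the penalty would be candidates for removal, which is the rule-selection effect the abstract refers to.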
Keywords: nonstationary neuro-fuzzy network; mean shift; group lasso; rule reduction
JEL-codes: C
Date: 2023
Downloads: (external link)
https://www.mdpi.com/2227-7390/12/1/120/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/1/120/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2023:i:1:p:120-:d:1310246
Mathematics is currently edited by Ms. Emma He
Bibliographic data for series maintained by MDPI Indexing Manager.