EW-CACTUs-MAML: A Robust Metalearning System for Rapid Classification on a Large Number of Tasks
Wen-Feng Wang, Jingjing Zhang, Peng An and Zhijie Wang
Complexity, 2022, vol. 2022, 1-8
Abstract:
This study aims to develop a robust metalearning system for rapid classification on a large number of tasks. Model-agnostic metalearning (MAML) combined with the CACTUs method (clustering to automatically construct tasks for unsupervised metalearning) is improved into EW-CACTUs-MAML by integrating the entropy weight (EW) method. Few-shot mechanisms are introduced into the deep network for efficient learning on a large number of tasks. The implementation process is theoretically interpreted as “gene intelligence.” Validation of EW-CACTUs-MAML on a typical dataset (Omniglot) indicates an accuracy of 97.42%, outperforming CACTUs-MAML (validation accuracy = 97.22%). At the end of this paper, the applicability of these ideas to improving another metalearning system (EW-CACTUs-ProtoNets) is also preliminarily discussed, based on a cross-validation on another typical dataset (MiniImageNet).
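The abstract names two ingredients that can be illustrated concretely: CACTUs-style task construction from unsupervised cluster pseudo-labels, and the classical entropy weight (EW) method. The following Python sketch is illustrative only and is not the authors' implementation; the function names (entropy_weights, build_cactus_tasks), the use of k-means for clustering, and the choice of what the entropy weights are applied to are assumptions made for this example.

```python
# Illustrative sketch (not the paper's code): EW weighting + CACTUs-style episodes.
import numpy as np
from sklearn.cluster import KMeans

def entropy_weights(X):
    """Classical entropy weight method: rows of X are items, columns are criteria.
    Low-entropy (more informative) criteria receive larger weights."""
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0, keepdims=True)           # column-wise proportions
    logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(m)        # entropy per criterion
    d = 1.0 - e                                    # degree of divergence
    return d / d.sum()                             # normalized weights

def build_cactus_tasks(embeddings, n_clusters=50, n_way=5, k_shot=1,
                       q_queries=5, n_tasks=100, seed=0):
    """CACTUs-style task construction: cluster unlabeled embeddings, treat the
    cluster ids as pseudo-labels, then sample N-way K-shot episodes."""
    rng = np.random.default_rng(seed)
    pseudo = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(embeddings)
    valid = [c for c in range(n_clusters)
             if (pseudo == c).sum() >= k_shot + q_queries]
    tasks = []
    for _ in range(n_tasks):
        classes = rng.choice(valid, size=n_way, replace=False)
        support, query = [], []
        for label, c in enumerate(classes):
            idx = rng.choice(np.flatnonzero(pseudo == c),
                             size=k_shot + q_queries, replace=False)
            support += [(i, label) for i in idx[:k_shot]]
            query += [(i, label) for i in idx[k_shot:]]
        tasks.append((support, query))
    return tasks

# Example: weight four hypothetical evaluation criteria with the EW method.
criteria = np.abs(np.random.default_rng(1).normal(size=(20, 4))) + 1e-6
print(entropy_weights(criteria))
```

In a MAML-style outer loop, such weights could, for instance, rescale per-task contributions to the meta-objective; how exactly EW enters EW-CACTUs-MAML is specified in the paper itself and is not reproduced here.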
Date: 2022
Downloads:
http://downloads.hindawi.com/journals/complexity/2022/7330823.pdf (application/pdf)
http://downloads.hindawi.com/journals/complexity/2022/7330823.xml (application/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:complx:7330823
DOI: 10.1155/2022/7330823