Fully Kernected Neural Networks

Wei Zhang, Zhi Han, Xiai Chen, Baichen Liu, Huidi Jia and Yandong Tang

Journal of Mathematics, 2023, vol. 2023, 1-9

Abstract:

In this paper, we apply kernel methods to deep convolutional neural networks (DCNNs) to improve their nonlinear ability. DCNNs have achieved significant improvements in many computer vision tasks. For an image classification task, accuracy saturates once the depth and width of the network are sufficient; increasing them further does not raise it. We find that improving the nonlinear ability of a DCNN can break through this saturation accuracy. In a DCNN, earlier layers are more inclined to extract features, while later layers are more inclined to classify them. We therefore apply kernel methods at the last fully connected layer to implicitly map features to a higher-dimensional space, improving nonlinear ability so that the network achieves better linear separability. We name the resulting networks fully kernected neural networks (fully connected neural networks with kernel methods). Our experimental results show that fully kernected neural networks achieve higher classification accuracy and a faster convergence rate than baseline networks.
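
The page gives only the abstract, so the exact kernel and training details are not specified here. The sketch below is a minimal, hypothetical PyTorch illustration of the stated idea: the inner products of the final fully connected layer are replaced by kernel evaluations against learnable class prototypes (a polynomial kernel is assumed), so features are implicitly compared in a higher-dimensional space. The KernelHead class, the backbone, and all hyperparameters are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class KernelHead(nn.Module):
        """Hypothetical 'kernected' classification head (illustrative only).

        A standard fully connected classifier computes the logit <w_c, x>
        for each class c. Here that inner product is replaced by a
        polynomial kernel k(w_c, x) = (<w_c, x> + coef0)^degree, which
        corresponds to an implicit map into a higher-dimensional space.
        """

        def __init__(self, in_features: int, num_classes: int,
                     degree: int = 2, coef0: float = 1.0):
            super().__init__()
            # One learnable prototype per class, as in a fully connected layer.
            self.weight = nn.Parameter(0.01 * torch.randn(num_classes, in_features))
            self.degree = degree
            self.coef0 = coef0

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            dot = x @ self.weight.t()                  # (batch, num_classes)
            return (dot + self.coef0) ** self.degree   # kernelized logits

    # Usage: swap the final nn.Linear of a backbone for the kernel head.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256), nn.ReLU())
    model = nn.Sequential(backbone, KernelHead(256, num_classes=10))
    logits = model(torch.randn(8, 3, 32, 32))          # -> shape (8, 10)

Because the kernel is applied only at the last layer, the rest of the network trains exactly as before; only the final logit computation changes, which matches the abstract's claim that the modification targets classification rather than feature extraction.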

Date: 2023

Downloads:
http://downloads.hindawi.com/journals/JMATH/2023/1539436.pdf (application/pdf)
http://downloads.hindawi.com/journals/JMATH/2023/1539436.xml (text/xml)


Persistent link: https://EconPapers.repec.org/RePEc:hin:jjmath:1539436

DOI: 10.1155/2023/1539436


 