Optimal Training Parameters and Hidden Layer Neuron Number of Two-Layer Perceptron for Generalised Scaled Object Classification Problem
Romanuke Vadim (Khmelnitsky National University)
Information Technology and Management Science, 2015, vol. 18, issue 1, 42-48
Abstract:
The research focuses on optimising a two-layer perceptron for the generalised scaled object classification problem. The optimisation criterion is the minimisation of classification inaccuracy, which depends on the training parameters and on the number of hidden layer neurons. Once statistics of the inaccuracy have been accumulated, the minimisation is carried out by a numerical search. The perceptron is then optimised further by extra training. As a result, the classification error percentage does not exceed 3 % even under the worst scale distortion.
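The abstract's procedure — train a two-layer perceptron for several hidden layer neuron numbers, accumulate error statistics, and pick the minimiser — can be sketched as follows. This is a minimal illustration only, not the paper's method: the synthetic scaled-object data, the candidate hidden sizes, the learning rate, and all function names here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n_per_class=200, dim=16, n_classes=3):
    # Hypothetical stand-in task: each class is a base pattern multiplied
    # by a random scale factor (the "scale distortion"), plus noise.
    bases = rng.normal(size=(n_classes, dim))
    X, y = [], []
    for c in range(n_classes):
        scales = rng.uniform(0.5, 1.5, size=(n_per_class, 1))
        noise = rng.normal(scale=0.1, size=(n_per_class, dim))
        X.append(scales * bases[c] + noise)
        y.extend([c] * n_per_class)
    return np.vstack(X), np.array(y)

def train_two_layer(X, y, hidden, epochs=300, lr=0.5):
    # One tanh hidden layer, softmax output, full-batch gradient descent.
    n, d = X.shape
    k = int(y.max()) + 1
    W1 = rng.normal(scale=0.1, size=(d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=(hidden, k)); b2 = np.zeros(k)
    Y = np.eye(k)[y]
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)
        G = (P - Y) / n                        # softmax cross-entropy gradient
        dW2 = H.T @ G; db2 = G.sum(axis=0)
        GH = (G @ W2.T) * (1.0 - H ** 2)       # backpropagate through tanh
        dW1 = X.T @ GH; db1 = GH.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def error_rate(params, X, y):
    W1, b1, W2, b2 = params
    pred = (np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
    return float(np.mean(pred != y))

X, y = make_data()
idx = rng.permutation(len(X))
split = len(X) // 2
tr, te = idx[:split], idx[split:]

candidates = [5, 10, 20, 40]                   # hidden neuron numbers to try
errors = {h: error_rate(train_two_layer(X[tr], y[tr], h), X[te], y[te])
          for h in candidates}
best = min(errors, key=errors.get)
print(errors, "best hidden size:", best)
```

The search here is a plain grid over candidate hidden sizes on held-out data; the paper's numerical search over accumulated inaccuracy statistics and its extra-training stage are not reproduced.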
Date: 2015
Downloads: https://doi.org/10.1515/itms-2015-0007 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:vrs:itmasc:v:18:y:2015:i:1:p:42-48:n:7
DOI: 10.1515/itms-2015-0007
Information Technology and Management Science is currently edited by J. Merkurjevs