Neural Networks as Positive Linear Operators

George A. Anastassiou
Additional contact information
George A. Anastassiou: Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, USA

Mathematics, 2025, vol. 13, issue 7, 1-17

Abstract: Basic neural network operators are interpreted as positive linear operators, and the related general theory applies to them. These operators are induced by a symmetrized density function derived from the parametrized and deformed hyperbolic tangent activation function. I work in the space of continuous functions from a compact interval of the real line to the reals, and I study quantitatively the rate of convergence of these neural network operators to the unit operator. The resulting inequalities involve the modulus of continuity of the function under approximation or of its derivative. From these inequalities I derive uniform and L^p (p ≥ 1) approximation results. The convexity of functions is also taken into consideration.
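
As a concrete illustration (a minimal sketch, not code from the paper), the following Python fragment builds a positive linear neural network operator of this type and checks its convergence numerically. The activation g_{q,λ}(x) = (e^{λx} − q e^{−λx})/(e^{λx} + q e^{−λx}), the induced density φ(x) = (g(x+1) − g(x−1))/4, and the symmetrization M(x) = (φ_q(x) + φ_{1/q}(x))/2 follow the shape of the author's earlier constructions; the exact definitions in the paper may differ, and the parameter values here are purely illustrative.

```python
import numpy as np

Q, LAM = 1.5, 1.0  # deformation and scaling parameters (illustrative values)

def g(x, q=Q, lam=LAM):
    """Parametrized, deformed hyperbolic tangent (assumed form):
    g_{q,lam}(x) = (e^{lam x} - q e^{-lam x}) / (e^{lam x} + q e^{-lam x})."""
    # Algebraically identical, overflow-safe form: tanh(lam*x - ln(q)/2).
    return np.tanh(lam * x - 0.5 * np.log(q))

def phi(x, q=Q):
    """Density induced by the activation: phi(x) = (g(x+1) - g(x-1)) / 4,
    nonnegative because g is strictly increasing."""
    return (g(x + 1.0, q) - g(x - 1.0, q)) / 4.0

def M(x, q=Q):
    """Symmetrized density: averaging the q and 1/q densities gives M(-x) = M(x)."""
    return 0.5 * (phi(x, q) + phi(x, 1.0 / q))

def A(f, x, n, a=0.0, b=1.0):
    """Positive linear neural network operator on C([a, b]):
    A_n(f, x) = sum_k f(k/n) M(nx - k) / sum_k M(nx - k),
    a convex combination of samples, so A_n is positive and A_n(1) = 1."""
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1.0)
    w = M(n * x - k)
    return np.dot(w, f(k / n)) / np.sum(w)

# Quantitative convergence to the unit operator: the sup-norm error on a grid
# shrinks as n grows, at a rate governed by the modulus of continuity of f.
f = np.cos
for n in (10, 100, 1000):
    grid = np.linspace(0.0, 1.0, 201)
    err = max(abs(A(f, x, n) - f(x)) for x in grid)
    print(f"n = {n:4d}   sup-norm error ~ {err:.2e}")
```

Because the weights M(nx − k) are nonnegative and normalized to sum to one, A_n(f, x) is a convex combination of the sample values f(k/n); hence A_n is a positive linear operator that reproduces constants, which is the structural property that lets the general (Korovkin-type) theory of positive linear operators apply.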

Keywords: neural network operators; positive linear operators; modulus of continuity; quantitative approximation to the unit
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/7/1112/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/7/1112/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:7:p:1112-:d:1622524

Mathematics is currently edited by Ms. Emma He

Handle: RePEc:gam:jmathe:v:13:y:2025:i:7:p:1112-:d:1622524