EconPapers    

Composition of Activation Functions and the Reduction to Finite Domain

George A. Anastassiou
Additional contact information
George A. Anastassiou: Department of Mathematical Sciences, University of Memphis, Memphis, TN 38152, USA

Mathematics, 2025, vol. 13, issue 19, 1-10

Abstract: This work determines the rate of pointwise and uniform convergence of the “normalized cusp neural network operators” to the unit operator. The cusp is a compact-support activation function obtained as the composition of two general activation functions whose domain is the whole real line. The convergence rates are expressed, via the modulus of continuity of the function involved or of its derivative, in the form of Jackson-type inequalities. Composing activation functions yields more flexible and powerful neural networks, and introduces for the first time the reduction of infinite domains to a single domain of compact support.
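The abstract does not spell out the operator or the cusp's exact formula, but operators of this family classically take the normalized form A_n(f)(x) = Σ_k f(k/n) b(nx − k) / Σ_k b(nx − k) for a compactly supported activation b. A minimal numerical sketch, using a hypothetical hat-shaped stand-in for the cusp (not the paper's actual composed activation), illustrates how compact support restricts the sum to finitely many k and how the error decays as n grows:

```python
import numpy as np

def cusp(t):
    # Hypothetical stand-in for the compact-support "cusp":
    # a hat function supported on [-1, 1]; NOT the paper's definition.
    return np.maximum(0.0, 1.0 - np.abs(t))

def normalized_operator(f, x, n, support=1.0):
    # A_n(f)(x) = sum_k f(k/n) b(nx - k) / sum_k b(nx - k).
    # Compact support of b means only k with |nx - k| <= support contribute,
    # so the infinite sum reduces to a finite one.
    k = np.arange(np.floor(n * x - support), np.ceil(n * x + support) + 1)
    w = cusp(n * x - k)
    return float(np.sum(f(k / n) * w) / np.sum(w))

# Error shrinks with n, consistent with Jackson-type rate estimates.
x = 1.0 / 3.0
errs = [abs(normalized_operator(np.sin, x, n) - np.sin(x))
        for n in (10, 100, 1000)]
```

The decreasing values in `errs` are only a sanity check of the convergence-to-the-unit-operator behavior the abstract quantifies; the sharp constants and moduli-of-continuity bounds are the subject of the paper itself.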

Keywords: neural network approximation; cusp activation function; modulus of continuity; reduction of domain (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/19/3177/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/19/3177/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:19:p:3177-:d:1764543

Access Statistics for this article

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jmathe:v:13:y:2025:i:19:p:3177-:d:1764543