
AHerfReLU: A Novel Adaptive Activation Function Enhancing Deep Neural Network Performance

Abaid Ullah, Muhammad Imran, Muhammad Abdul Basit, Madeeha Tahir and Jihad Younis

Complexity, 2025, vol. 2025, 1-17

Abstract: In deep learning, the choice of activation function plays a vital role in model performance. We propose AHerfReLU, a novel activation function that combines the rectified linear unit (ReLU) with the error function (erf), complemented by a regularization term 1/(1 + x^2) that ensures smooth gradients even for negative inputs. The function is zero-centered, bounded below, and nonmonotonic, offering significant advantages over traditional activation functions such as ReLU. We compare AHerfReLU with 10 adaptive activation functions and with state-of-the-art activation functions, including ReLU, Swish, and Mish. Experimental results show that replacing ReLU with AHerfReLU yields a 3.18% improvement in Top-1 accuracy with the LeNet network on CIFAR100, a 0.63% improvement on CIFAR10, and a 1.3% improvement in mean average precision (mAP) with the SSD300 model on the Pascal VOC dataset. Our results demonstrate that AHerfReLU enhances model performance, offering improved accuracy, reduced loss, and more stable convergence. The function outperforms existing activation functions, providing a promising alternative for deep learning tasks.
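
The abstract does not give the closed form of AHerfReLU, so the following is only a rough PyTorch sketch of how an activation with the stated ingredients (a ReLU branch gated by erf, plus a 1/(1 + x^2) term that keeps negative-input gradients smooth) might be assembled. The class name AHerfReLUSketch and the learnable parameters alpha and beta are illustrative assumptions, not the authors' definition.

    import torch
    import torch.nn as nn

    class AHerfReLUSketch(nn.Module):
        # Illustrative guess only: a ReLU branch gated by erf, combined with a
        # 1/(1 + x^2) term, mirroring the properties described in the abstract
        # (bounded below, nonmonotonic, smooth gradients for negative inputs).
        def __init__(self, alpha: float = 1.0, beta: float = 1.0):
            super().__init__()
            # Adaptive (trainable) shape parameters -- assumed, not from the paper.
            self.alpha = nn.Parameter(torch.tensor(alpha))
            self.beta = nn.Parameter(torch.tensor(beta))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Positive inputs: roughly x * erf(alpha * x), i.e. ReLU-like growth.
            # Negative inputs: only beta * x / (1 + x^2) remains, giving a small,
            # bounded, smoothly differentiable negative response.
            return torch.relu(x) * torch.erf(self.alpha * x) + self.beta * x / (1.0 + x * x)

Dropping such a module into a LeNet-style network in place of nn.ReLU() would mimic the kind of substitution the paper evaluates, though the exact formulation and training setup used by the authors may differ.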

Date: 2025

Downloads: (external link)
http://downloads.hindawi.com/journals/complexity/2025/8233876.pdf (application/pdf)
http://downloads.hindawi.com/journals/complexity/2025/8233876.xml (application/xml)



Persistent link: https://EconPapers.repec.org/RePEc:hin:complx:8233876

DOI: 10.1155/cplx/8233876



Handle: RePEc:hin:complx:8233876