Randomized Gauss–Seidel iterative algorithms for Extreme Learning Machines
Chinnamuthu Subramani,
Ravi Prasad K. Jagannath and
Venkatanareshbabu Kuppili
Physica A: Statistical Mechanics and its Applications, 2025, vol. 666, issue C
Abstract:
Extreme Learning Machines (ELMs) are a class of single hidden-layer feedforward neural networks known for their rapid training process, structural simplicity, and strong generalization capabilities. ELM training requires solving a system of linear equations, where solution accuracy directly impacts model performance. However, conventional ELMs rely on the Moore–Penrose inverse, which is computationally expensive, memory-intensive, and numerically unstable in ill-conditioned problems. Additionally, stabilizing matrix inversion requires a hyperparameter, whose optimal selection further increases computational complexity. Iterative numerical techniques offer a promising alternative; however, the stochastic nature of the feature matrix challenges deterministic methods, while stochastic gradient approaches are hyperparameter-sensitive and prone to local minima. To address these limitations, this study introduces randomized iterative algorithms that solve the original linear system without requiring matrix inversion or full-system computation, instead leveraging random subsets of data in a hyperparameter-free framework. Although these methods incorporate randomness, they are not arbitrary but remain system-dependent, dynamically adapting to the structure of the feature matrix. Theoretical analysis establishes upper bounds on the expected number of iterations, expressed in terms of statistical properties of the feature matrix, providing insights into near-singularity, condition number, and network size. Empirical evaluations on classification datasets demonstrate that the proposed methods consistently outperform conventional ELM, deterministic solvers, and gradient descent-based methods in accuracy, efficiency, and robustness. Statistical validation using Friedman’s rank test and Wilcoxon post-hoc analysis confirms the superior performance and reliability of these randomized algorithms, establishing them as a computationally efficient and numerically stable alternative to existing approaches.
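The listing carries no code, so the following is only a minimal sketch of the kind of randomized Gauss–Seidel (randomized coordinate descent) update the abstract describes, applied to the ELM least-squares problem: minimize ||H beta - T||^2, where H is the hidden-layer output matrix, T the targets, and beta the output weights. Conventional ELM would compute beta = H+ T via the Moore–Penrose pseudoinverse, often stabilized with a ridge hyperparameter; the sketch below instead updates one coordinate of beta at a time, sampling columns in proportion to their squared norms so the iteration adapts to the structure of H. All function names, sizes, and the tanh activation are illustrative assumptions, not the authors' implementation.

import numpy as np

def randomized_gauss_seidel(A, b, n_iters=5000, seed=0):
    """Randomized Gauss-Seidel for the least-squares problem
    min_x ||Ax - b||^2 (a sketch, not the paper's code).

    Each step samples a column j with probability proportional to
    ||A[:, j]||^2 and updates x_j so that the j-th component of the
    gradient A^T (Ax - b) becomes zero (exact line search along e_j).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    col_sq = np.sum(A * A, axis=0)        # squared column norms
    probs = col_sq / col_sq.sum()         # system-dependent sampling distribution
    x = np.zeros(n)
    r = b.astype(float).copy()            # residual r = b - Ax
    for _ in range(n_iters):
        j = rng.choice(n, p=probs)
        delta = A[:, j] @ r / col_sq[j]   # optimal step for coordinate j
        x[j] += delta
        r -= delta * A[:, j]              # keep the residual consistent with x
    return x

# Illustrative toy ELM setup (all sizes assumed):
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))              # inputs
W = rng.standard_normal((5, 50))               # random input weights
c = rng.standard_normal(50)                    # random biases
H = np.tanh(X @ W + c)                         # hidden-layer output matrix
T = rng.standard_normal(200)                   # targets
beta = randomized_gauss_seidel(H, T)           # output weights, no inversion
print(np.linalg.norm(H.T @ (H @ beta - T)))    # normal-equation residual, decays toward 0

Note that no matrix is inverted and no hyperparameter is tuned, which is the point the abstract makes; the norm-proportional sampling is what makes the randomness system-dependent rather than arbitrary, and the conditioning of H governs how many iterations such a scheme needs, in line with the expected-iteration bounds the paper derives.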
Keywords: Least squares problem; Coordinate descent method; Randomized Gauss–Seidel method; Randomized Extended Gauss–Seidel method; Stochastic gradient descent; Adam optimizer
Date: 2025
Downloads:
http://www.sciencedirect.com/science/article/pii/S0378437125001670
Full text for ScienceDirect subscribers only. The journal offers the option of making the article available online on ScienceDirect for a fee of $3,000.
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:666:y:2025:i:c:s0378437125001670
DOI: 10.1016/j.physa.2025.130515
Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H. E. Stanley and C. Tsallis
More articles in Physica A: Statistical Mechanics and its Applications from Elsevier
Bibliographic data for series maintained by Catherine Liu.