Sinusoidal Approximation Theorem for Kolmogorov–Arnold Networks
Sergei Gleyzer,
Hanh Nguyen,
Dinesh P. Ramakrishnan and
Eric A. F. Reinhardt
Additional contact information
Sergei Gleyzer: Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL 35405, USA
Hanh Nguyen: Department of Mathematics, University of Alabama, Tuscaloosa, AL 35405, USA
Dinesh P. Ramakrishnan: Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL 35405, USA
Eric A. F. Reinhardt: Department of Physics and Astronomy, University of Alabama, Tuscaloosa, AL 35405, USA
Mathematics, 2025, vol. 13, issue 19, 1-15
Abstract:
The Kolmogorov–Arnold representation theorem states that any continuous multivariable function can be exactly represented as a finite superposition of continuous single-variable functions. Subsequent simplifications of this representation express these functions as parameterized sums of a smaller number of unique monotonic functions. Kolmogorov–Arnold Networks (KANs) have recently been proposed as an alternative to multilayer perceptrons. KANs feature learnable nonlinear activations applied directly to input values, modeled as weighted sums of basis spline functions; this approach replaces the linear transformations and sigmoidal post-activations used in traditional perceptrons. In this work, we propose a novel KAN variant that replaces both the inner and outer functions of the Kolmogorov–Arnold representation with weighted sinusoidal functions of learnable frequencies. In particular, we fix the phases of the sinusoidal activations to linearly spaced constant values and provide a proof of the theoretical validity of this construction. We also conduct numerical experiments to evaluate its performance on a range of multivariable functions, comparing it with fixed-frequency Fourier transform methods, basis spline KANs (B-SplineKANs), and multilayer perceptrons (MLPs). We show that it outperforms both the fixed-frequency Fourier transform and the B-SplineKAN, and achieves performance comparable to the MLP.
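The layer described in the abstract — edge activations built as weighted sums of sinusoids with learnable frequencies and fixed, linearly spaced phases — can be sketched as a forward pass in NumPy. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name, the phase range [0, π), and the weight shapes are all illustrative choices.

```python
import numpy as np

def sinekan_layer(x, weights, freqs):
    """Forward pass of one sinusoidal KAN layer (illustrative sketch).

    x       : (batch, d_in)       input values
    weights : (d_out, d_in, K)    learnable mixing weights
    freqs   : (K,)                learnable frequencies
    """
    K = freqs.shape[0]
    # Fixed, linearly spaced phases (an assumed spacing over [0, pi);
    # the paper's exact grid may differ).
    phases = np.linspace(0.0, np.pi, K, endpoint=False)           # (K,)
    # Sinusoidal basis evaluated on every input coordinate.
    basis = np.sin(x[..., None] * freqs + phases)                 # (batch, d_in, K)
    # Each output is a weighted sum of basis responses over inputs
    # and basis functions, i.e. the learnable edge activations.
    return np.einsum("bik,oik->bo", basis, weights)               # (batch, d_out)

# Tiny usage example with random parameters.
rng = np.random.default_rng(0)
batch, d_in, d_out, K = 4, 3, 2, 5
x = rng.normal(size=(batch, d_in))
w = rng.normal(size=(d_out, d_in, K)) / np.sqrt(d_in * K)
f = rng.uniform(0.5, 3.0, size=K)
y = sinekan_layer(x, w, f)
print(y.shape)  # (4, 2)
```

Only `weights` and `freqs` would be trained; the phase grid stays constant, which is the simplification whose validity the paper proves.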
Keywords: approximation; machine learning; representation; periodic functions
JEL-codes: C
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2227-7390/13/19/3157/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/19/3157/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:19:p:3157-:d:1763665
Mathematics is currently edited by Ms. Emma He