DOE PAGES · U.S. Department of Energy
Office of Scientific and Technical Information

Title: SineKAN: Kolmogorov-Arnold Networks using sinusoidal activation functions

Journal Article · Frontiers in Artificial Intelligence

Recent work has established an alternative to traditional multi-layer perceptron neural networks in the form of Kolmogorov-Arnold Networks (KAN). The general KAN framework uses learnable activation functions on the edges of the computational graph followed by summation on nodes. The learnable edge activation functions in the original implementation are basis spline functions (B-Spline). Here, we present a model in which the learnable grids of B-Spline activation functions are replaced by grids of re-weighted sine functions (SineKAN). We evaluate the numerical performance of our model on a benchmark vision task. We show that our model can perform better than or comparably to B-Spline KAN models and an alternative KAN implementation based on periodic cosine and sine functions representing a Fourier series. Further, we show that SineKAN has numerical accuracy that could scale comparably to dense neural networks (DNNs). Compared to the two baseline KAN models, SineKAN achieves a substantial speed increase at all hidden layer sizes, batch sizes, and depths. The current advantage of DNNs due to hardware and software optimizations is discussed along with theoretical scaling. Additionally, properties of SineKAN compared to other KAN implementations and current limitations are discussed.
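The abstract describes replacing the B-Spline edge activations of a KAN layer with grids of re-weighted sine functions. A minimal sketch of such a layer is given below. The parameterization (a fixed grid of integer frequencies with random phases, learnable amplitudes per edge and basis function) is an illustrative assumption, not the paper's exact formulation; the class and parameter names are hypothetical.

```python
import numpy as np

class SineKANLayer:
    """Sketch of a KAN layer with sinusoidal edge activations.

    Each edge (input i -> output j) applies a weighted sum of
    grid_size sine basis functions; each output node sums the
    contributions over all inputs. The frequency/phase grid used
    here is an assumption for illustration only.
    """

    def __init__(self, in_dim, out_dim, grid_size=8, seed=0):
        rng = np.random.default_rng(seed)
        # Learnable amplitudes: one weight per (input, output, basis fn).
        self.amp = rng.normal(0.0, 0.1, (in_dim, out_dim, grid_size))
        # Fixed grid of frequencies and phases (assumed form).
        self.freq = np.arange(1, grid_size + 1, dtype=float)
        self.phase = rng.uniform(0.0, 2.0 * np.pi, grid_size)

    def forward(self, x):
        # x: (batch, in_dim) -> basis: (batch, in_dim, grid_size)
        basis = np.sin(x[..., None] * self.freq + self.phase)
        # Contract over inputs and basis functions -> (batch, out_dim).
        return np.einsum('big,iog->bo', basis, self.amp)

layer = SineKANLayer(in_dim=3, out_dim=2)
y = layer.forward(np.zeros((4, 3)))  # output shape: (4, 2)
```

Because each sine basis evaluation is a dense elementwise operation followed by a single tensor contraction, a layer like this maps naturally onto batched matrix kernels, which is consistent with the speed advantage the abstract reports over spline-based KAN baselines.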

Sponsoring Organization:
USDOE
Grant/Contract Number:
SC0012447
OSTI ID:
2500918
Journal Information:
Frontiers in Artificial Intelligence, Vol. 7; ISSN 2624-8212
Publisher:
Frontiers Media SA
Country of Publication:
Switzerland
Language:
English

Similar Records

SPIKANs: separable physics-informed Kolmogorov–Arnold networks
Journal Article · 2025 · Machine Learning: Science and Technology · OSTI ID:3015493

Kolmogorov-Arnold wavefunctions
Journal Article · 2025 · Physical Review. C · OSTI ID:3007459

Baseflow Identification via Explainable AI With Kolmogorov‐Arnold Networks
Journal Article · 2025 · Journal of Geophysical Research: Machine Learning and Computation · OSTI ID:3004776