SineKAN: Kolmogorov-Arnold Networks using sinusoidal activation functions

Recent work has established an alternative to traditional multi-layer perceptron neural networks in the form of Kolmogorov-Arnold Networks (KANs). The general KAN framework uses learnable activation functions on the edges of the computational graph, followed by summation on the nodes. In the original implementation, the learnable edge activation functions are basis spline (B-Spline) functions. Here, we present a model in which the learnable grids of B-Spline activation functions are replaced by grids of re-weighted sine functions (SineKAN). We evaluate the numerical performance of our model on a benchmark vision task and show that it can perform better than, or comparably to, B-Spline KAN models and an alternative KAN implementation based on periodic sine and cosine functions representing a Fourier series. Further, we show that SineKAN has numerical accuracy that could scale comparably to dense neural networks (DNNs). Compared to the two baseline KAN models, SineKAN achieves a substantial speed increase at all hidden layer sizes, batch sizes, and depths. The current advantage of DNNs due to hardware and software optimizations is discussed along with theoretical scaling. Properties of SineKAN relative to other KAN implementations, as well as current limitations, are also discussed.
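The abstract describes replacing learnable grids of B-Spline edge activations with grids of re-weighted sine functions. As a rough illustration of what such a layer could look like, here is a minimal NumPy sketch of a KAN-style layer whose per-edge activations are amplitude-weighted sines on a fixed frequency grid. This is an assumption-based sketch, not the authors' implementation: the class name, the integer frequency grid, the zero phases, and the amplitude initialization are all hypothetical choices.

```python
import numpy as np

class SineKANLayer:
    """Hypothetical sketch of a KAN-style layer: each edge (i -> j) applies a
    learnable activation phi_ij(x) = sum_k A[i, j, k] * sin(f_k * x + p_k),
    and node j sums the edge outputs over all inputs i."""

    def __init__(self, in_dim, out_dim, grid_size=8, seed=None):
        rng = np.random.default_rng(seed)
        # One learnable amplitude per (input, output, grid frequency).
        self.amp = rng.normal(
            0.0, 1.0 / np.sqrt(in_dim * grid_size), (in_dim, out_dim, grid_size)
        )
        # Fixed grid of frequencies and phases (an assumed, simple choice).
        self.freq = np.arange(1, grid_size + 1, dtype=float)
        self.phase = np.zeros(grid_size)
        self.bias = np.zeros(out_dim)

    def __call__(self, x):
        # x: (batch, in_dim). Evaluate sin(freq * x + phase) per grid point.
        s = np.sin(x[..., None] * self.freq + self.phase)  # (batch, in, grid)
        # Weight each sine by its amplitude, then sum over inputs and grid.
        return np.einsum("big,iog->bo", s, self.amp) + self.bias

layer = SineKANLayer(4, 3, grid_size=8, seed=0)
out = layer(np.random.default_rng(1).normal(size=(2, 4)))
print(out.shape)  # (2, 3)
```

Because the sines live on a shared frequency grid, the forward pass reduces to an elementwise `sin` plus one contraction, which is consistent with the abstract's claim that the model can be evaluated quickly at all layer sizes and batch sizes.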


Bibliographic Details
Main Authors: Eric Reinhardt, Dinesh Ramakrishnan, Sergei Gleyzer
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-01-01
Series: Frontiers in Artificial Intelligence
Subjects: machine learning (ML); periodic function; Kolmogorov-Arnold Representation; Kolmogorov-Arnold Networks (KANs); sinusoidal activation function
Online Access: https://www.frontiersin.org/articles/10.3389/frai.2024.1462952/full
author Eric Reinhardt
Dinesh Ramakrishnan
Sergei Gleyzer
collection DOAJ
description Recent work has established an alternative to traditional multi-layer perceptron neural networks in the form of Kolmogorov-Arnold Networks (KANs). The general KAN framework uses learnable activation functions on the edges of the computational graph, followed by summation on the nodes. In the original implementation, the learnable edge activation functions are basis spline (B-Spline) functions. Here, we present a model in which the learnable grids of B-Spline activation functions are replaced by grids of re-weighted sine functions (SineKAN). We evaluate the numerical performance of our model on a benchmark vision task and show that it can perform better than, or comparably to, B-Spline KAN models and an alternative KAN implementation based on periodic sine and cosine functions representing a Fourier series. Further, we show that SineKAN has numerical accuracy that could scale comparably to dense neural networks (DNNs). Compared to the two baseline KAN models, SineKAN achieves a substantial speed increase at all hidden layer sizes, batch sizes, and depths. The current advantage of DNNs due to hardware and software optimizations is discussed along with theoretical scaling. Properties of SineKAN relative to other KAN implementations, as well as current limitations, are also discussed.
format Article
id doaj-art-1c6069147ed14b9ebfaa51b395a844d0
institution Kabale University
issn 2624-8212
language English
publishDate 2025-01-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Artificial Intelligence
title SineKAN: Kolmogorov-Arnold Networks using sinusoidal activation functions
topic machine learning (ML)
periodic function
Kolmogorov-Arnold Representation
Kolmogorov-Arnold Networks (KANs)
sinusoidal activation function
url https://www.frontiersin.org/articles/10.3389/frai.2024.1462952/full