Global Universality of the Two-Layer Neural Network with the k-Rectified Linear Unit
This paper concerns the universality of two-layer neural networks with the k-rectified linear unit activation function, k = 1, 2, …, under a suitable norm and without any restriction on the shape of the domain in the real line. This type of result is called global universality, which extends the previo...
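The record does not spell out the activation function; a minimal sketch, assuming the standard rectified power unit convention for the k-rectified linear unit (which reduces to the ordinary ReLU at k = 1), and with the symbols $n$, $a_j$, $w_j$, $b_j$ introduced here for illustration rather than taken from the paper:

$$\sigma_k(x) = \max(x, 0)^k, \qquad k = 1, 2, \dots,$$

$$f(x) = \sum_{j=1}^{n} a_j\, \sigma_k(w_j x + b_j), \qquad a_j, w_j, b_j \in \mathbb{R}.$$

Universality then means that finite sums $f$ of this form can approximate any target in the relevant function space; "global" indicates that the approximation is on the whole real line rather than on a fixed bounded domain.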
| Main Authors: | Naoya Hatano, Masahiro Ikeda, Isao Ishikawa, Yoshihiro Sawano |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2024-01-01 |
| Series: | Journal of Function Spaces |
| Online Access: | http://dx.doi.org/10.1155/2024/3262798 |
Similar Items
- AReLU: Agile Rectified Linear Unit for Improving Lightweight Convolutional Neural Networks
  by: Fu Chen, et al.
  Published: (2025-01-01)
- Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation
  by: Chaity Banerjee, et al.
  Published: (2020-06-01)
- Retinopathy Disease Detection and Classification Using a Coordinate Attention Module-Based Convolutional Neural Network with Leaky Rectified Linear Unit
  by: Pravin Balaso Chopade, et al.
  Published: (2025-01-01)
- Back-action supercurrent rectifiers
  by: Daniel Margineda, et al.
  Published: (2025-01-01)
- A perturbation-based model for rectifier circuits
  by: Vipin B. Vats, et al.
  Published: (2006-01-01)