Feature Representations Using the Reflected Rectified Linear Unit (RReLU) Activation
Deep Neural Networks (DNNs) have become the tool of choice for machine learning practitioners today. One important aspect of designing a neural network is the choice of the activation function to be used at the neurons of the different layers. In this work, we introduce a four-output activation func...
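The abstract describes a four-output activation function based on reflections of the standard ReLU. The paper's exact definition is not given in this record, but one plausible reading, sketched here purely as an assumption, is that each scalar input produces the ReLU response together with its reflections about the horizontal and vertical axes:

```python
# Hedged sketch only: the record's abstract mentions a "four-output"
# reflected ReLU but does not define it. The four outputs below
# (ReLU(x), -ReLU(x), ReLU(-x), -ReLU(-x)) are an ASSUMED reading,
# not the authors' confirmed formulation.

def reflected_relu4(x: float) -> tuple:
    """Return four candidate feature channels for a single input x."""
    pos = max(x, 0.0)    # standard ReLU
    neg = max(-x, 0.0)   # ReLU reflected about the vertical axis
    # The two remaining channels are the reflections about the x-axis.
    return (pos, -pos, neg, -neg)

# Example: a positive input activates the first pair of channels,
# a negative input activates the second pair.
print(reflected_relu4(2.0))
print(reflected_relu4(-3.0))
```

Under this reading, the activation quadruples the width of each layer's output, so downstream layers would need their input dimensions scaled accordingly.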
| Main Authors: | Chaity Banerjee, Tathagata Mukherjee, Eduardo Pasiliao Jr. |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Tsinghua University Press, 2020-06-01 |
| Series: | Big Data Mining and Analytics |
| Online Access: | https://www.sciopen.com/article/10.26599/BDMA.2019.9020024 |
Similar Items
- Retinopathy Disease Detection and Classification Using a Coordinate Attention Module-Based Convolutional Neural Network with Leaky Rectified Linear Unit
  by: Pravin Balaso Chopade, et al.
  Published: (2025-01-01)
- Dual Input Interleaved Three-Phase Rectifier In Discontinuous Conduction Mode For Application In Small Wind Turbines
  by: Guilherme M. Todys, et al.
  Published: (2025-01-01)
- Study on Rectifier Technology in Forebay of Pumping Station
  by: SU Zhengyang, et al.
  Published: (2020-01-01)
- A low‐rating 40‐pulse AC–DC rectifier based on a new passive harmonic mitigation circuit
  by: Rohollah Abdollahi, et al.
  Published: (2022-12-01)
- Advanced Rectifier Technologies for Electrolysis-Based Hydrogen Production: A Comparative Study and Real-World Applications
  by: Yan Gao, et al.
  Published: (2024-12-01)