Local kernel renormalization as a mechanism for feature learning in overparametrized convolutional neural networks
Abstract: Empirical evidence shows that fully-connected neural networks in the infinite-width limit (lazy training) eventually outperform their finite-width counterparts in most computer vision tasks; on the other hand, modern architectures with convolutional layers often achieve optimal performance...
Main Authors: R. Aiudi, R. Pacelli, P. Baglioni, A. Vezzani, R. Burioni, P. Rotondo
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Nature Communications
Online Access: https://doi.org/10.1038/s41467-024-55229-3
Similar Items
- Mass Renormalization in the Nelson Model
  by: Fumio Hiroshima, et al.
  Published: (2017-01-01)
- Renormalization group analysis for thermal turbulence
  by: D. N. Riahi
  Published: (1997-01-01)
- Renormalization Group Equation for Tsallis Statistics
  by: Airton Deppman
  Published: (2018-01-01)
- Stochastic renormalization group and gradient flow
  by: Andrea Carosso
  Published: (2020-01-01)
- Local and Deep Features Based Convolutional Neural Network Frameworks for Brain MRI Anomaly Detection
  by: Sajad Einy, et al.
  Published: (2022-01-01)