Local kernel renormalization as a mechanism for feature learning in overparametrized convolutional neural networks
Abstract
Empirical evidence shows that fully-connected neural networks in the infinite-width limit (lazy training) eventually outperform their finite-width counterparts in most computer vision tasks; on the other hand, modern architectures with convolutional layers often achieve optimal performances...
| Main Authors: | R. Aiudi, R. Pacelli, P. Baglioni, A. Vezzani, R. Burioni, P. Rotondo |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-01-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-024-55229-3 |
Similar Items
- A Sliding‐Kernel Computation‐In‐Memory Architecture for Convolutional Neural Network
  by: Yushen Hu, et al.
  Published: (2024-12-01)
- Local and Deep Features Based Convolutional Neural Network Frameworks for Brain MRI Anomaly Detection
  by: Sajad Einy, et al.
  Published: (2022-01-01)
- Mass Renormalization in the Nelson Model
  by: Fumio Hiroshima, et al.
  Published: (2017-01-01)
- Adaptive Kernel Convolutional Stereo Matching Recurrent Network
  by: Jiamian Wang, et al.
  Published: (2024-11-01)
- Stability of Cauchy–Stieltjes Kernel Families by Free and Boolean Convolutions Product
  by: Ayed. R. A. Alanzi, et al.
  Published: (2024-11-01)