Emergent Self‐Adaptation in an Integrated Photonic Neural Network for Backpropagation‐Free Learning


Bibliographic Details
Main Authors: Alessio Lugnan, Samarth Aggarwal, Frank Brückerhoff‐Plückelmann, C. David Wright, Wolfram H. P. Pernice, Harish Bhaskaran, Peter Bienstman
Format: Article
Language: English
Published: Wiley 2025-01-01
Series: Advanced Science
Subjects:
Online Access:https://doi.org/10.1002/advs.202404920
Description
Summary: Plastic self-adaptation, nonlinear recurrent dynamics, and multi-scale memory are desirable features in hardware implementations of neural networks, because they enable the hardware to learn, adapt, and process information in a way similar to biological brains. In this work, these properties are experimentally demonstrated in arrays of photonic neurons. Importantly, they arise autonomously, in an emergent fashion, without an external controller setting weights and without explicit feedback of a global reward signal. Using a hierarchy of such arrays coupled to a backpropagation-free training algorithm based on simple logistic regression, a performance of 98.2% is achieved on MNIST, a popular handwritten-digit classification benchmark. The plastic nodes consist of silicon photonic microring resonators covered by a patch of phase-change material that implements nonvolatile memory. The system is compact, robust, and straightforward to scale up through the use of multiple wavelengths. Moreover, it constitutes a unique platform to test and efficiently implement biologically plausible learning schemes at high processing speed.
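The abstract describes a scheme in which only a logistic-regression readout is trained, with no backpropagation through the photonic layers themselves. A minimal NumPy sketch of that general idea, using a fixed random nonlinear feature map as a stand-in for the photonic arrays (all shapes, names, and the toy dataset are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the fixed photonic array hierarchy:
# a random nonlinear map whose weights are never trained.
W_fixed = rng.normal(size=(2, 64))

def photonic_features(x):
    # tanh stands in for the nodes' nonlinear response
    return np.tanh(x @ W_fixed)

# Toy two-class dataset: two Gaussian blobs in 2D
X = np.vstack([rng.normal(-1.0, 0.5, size=(100, 2)),
               rng.normal(+1.0, 0.5, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Train only the logistic-regression readout by gradient descent;
# no gradients ever flow through W_fixed (backpropagation-free).
H = photonic_features(X)
w = np.zeros(H.shape[1])
b = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(H @ w + b)))  # sigmoid of readout logits
    grad = p - y                            # cross-entropy gradient w.r.t. logits
    w -= lr * (H.T @ grad) / len(y)
    b -= lr * grad.mean()

p = 1.0 / (1.0 + np.exp(-(H @ w + b)))
acc = ((p > 0.5) == y).mean()
print(f"readout accuracy: {acc:.3f}")
```

The point of the sketch is the division of labor: the fixed "photonic" map supplies rich nonlinear features, and all learning is confined to a simple convex readout, which is what makes the scheme backpropagation-free.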
ISSN:2198-3844