
Using photonic reservoirs as preprocessors for deep neural networks

Authors :
Ian Bauwens
Guy Van der Sande
Peter Bienstman
Guy Verschaffelt
Source :
Frontiers in Physics, Vol 10 (2022)
Publication Year :
2022
Publisher :
Frontiers Media S.A., 2022.

Abstract

Artificial neural networks are very time-consuming and energy-intensive to train, especially when the size of the neural network is increased in an attempt to improve performance. In this paper, we propose to preprocess the input data of a deep neural network using a reservoir, a concept originally introduced in the framework of reservoir computing. The key idea is to use such a reservoir to transform the input data into a state in a higher-dimensional state space, which allows the deep neural network to process the data with improved performance. We focus on photonic reservoirs because of their fast computation times and low energy consumption. Based on numerical simulations of delay-based reservoirs using a semiconductor laser, we show that such preprocessed data leads to improved performance of deep neural networks. Furthermore, we show that the parameters of the preprocessing reservoir do not need to be carefully fine-tuned.
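To make the preprocessing idea concrete, the sketch below shows a generic software reservoir that expands each input sample into a higher-dimensional nonlinear state vector, which a downstream deep neural network would then consume instead of the raw input. This is not the authors' simulation code: the paper models a delay-based photonic reservoir built around a semiconductor laser, whereas this sketch uses a simple tanh node model, and all names and parameter values are assumptions chosen for illustration.

    # Illustrative sketch only: a generic software reservoir used as a preprocessor.
    # The photonic delay-based reservoir in the paper realizes a comparable nonlinear
    # high-dimensional mapping with a semiconductor laser instead of tanh nodes.
    import numpy as np

    rng = np.random.default_rng(0)

    N_RES = 200          # number of reservoir (virtual) nodes -- assumed value
    INPUT_DIM = 28 * 28  # e.g. a flattened image sample -- assumed value

    # Fixed, untrained random weights: the reservoir itself is not optimized.
    W_in = rng.uniform(-1.0, 1.0, size=(N_RES, INPUT_DIM)) * 0.1
    W_res = rng.normal(0.0, 1.0, size=(N_RES, N_RES))
    W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # scale spectral radius

    def reservoir_preprocess(samples):
        """Map each input sample to a nonlinear, higher-dimensional reservoir state."""
        state = np.zeros(N_RES)
        states = []
        for x in samples:
            # Nonlinear node update driven by the current input and the previous state.
            state = np.tanh(W_in @ x + W_res @ state)
            states.append(state.copy())
        return np.array(states)

    # Usage: features = reservoir_preprocess(X_train); the deep neural network is
    # then trained on `features` rather than on the raw inputs X_train.

Because the reservoir weights are fixed and random, the only trained component remains the deep neural network itself, which is consistent with the paper's observation that the preprocessing reservoir does not require careful fine-tuning.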

Details

Language :
English
ISSN :
2296-424X
Volume :
10
Database :
Directory of Open Access Journals
Journal :
Frontiers in Physics
Publication Type :
Academic Journal
Accession number :
edsdoj.585a7384bfdf4858b8d0ada5972824e3
Document Type :
article
Full Text :
https://doi.org/10.3389/fphy.2022.1051941