1. Computational event-driven vision sensors for in-sensor spiking neural networks
- Author
Zhou, Yue; Fu, Jiawei; Chen, Zirui; Zhuge, Fuwei; Wang, Yasai; Yan, Jianmin; Ma, Sijie; Xu, Lin; Yuan, Huanmei; Chan, Man Sun; Miao, Xiangshui; He, Yuhui; and Chai, Yang
- Abstract
Neuromorphic event-based image sensors capture only the dynamic motion in a scene, which is then transferred to computation units for motion recognition. This approach, however, leads to time latency and can be power-consuming. Here we report computational event-driven vision sensors that capture and directly convert dynamic motion into programmable, sparse and informative spiking signals. The sensors can be used to form a spiking neural network for motion recognition. Each individual vision sensor consists of two parallel photodiodes with opposite polarities and has a temporal resolution of 5 µs. In response to changes in light intensity, the sensors generate spiking signals with different amplitudes and polarities by electrically programming their individual photoresponsivity. The non-volatile and multilevel photoresponsivity of the vision sensors can emulate synaptic weights and can be used to create an in-sensor spiking neural network. Our computational event-driven vision sensor approach eliminates redundant data during the sensing process, as well as the need for data transfer between sensors and computation units. A spiking neural network based on these event-driven vision sensors can thus be built from pairs of parallel photodiodes of opposite polarities that output programmable spike-signal trains in response to changes in light intensity.
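The abstract describes two mechanisms that a short numerical sketch can make concrete: (1) each pixel emits a signed spike only on a temporal change in light intensity, and (2) the programmable net photoresponsivity of the pixel's two opposite-polarity photodiodes acts as a synaptic weight, so summing photocurrents along shared lines performs the weighted summation of a spiking-network layer inside the sensor. The toy model below is a minimal sketch of that scheme, not the authors' implementation; the array sizes, weight values, threshold, and leak factor are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and values. The paper's non-volatile, multilevel
# photoresponsivity per pixel is modeled here as a signed scalar weight
# (the net of the two opposite-polarity photodiodes).
n_pixels, n_neurons = 16, 4
W = rng.uniform(-1.0, 1.0, size=(n_pixels, n_neurons))  # programmed responsivities

def pixel_events(frames):
    """Event-driven front end: each pixel outputs a signed amplitude
    proportional to the temporal change in light intensity; a static
    scene produces no output (sparse encoding)."""
    return np.diff(frames, axis=0)  # shape (T-1, n_pixels)

def in_sensor_snn(frames, W, threshold=0.5, leak=0.9):
    """Weighted pixel photocurrents are summed along shared lines and
    drive leaky integrate-and-fire output neurons (assumed neuron model)."""
    events = pixel_events(frames)
    v = np.zeros(W.shape[1])        # membrane potentials
    spikes = []
    for e in events:
        v = leak * v + e @ W        # in-sensor weighted summation
        fired = v >= threshold
        v[fired] = 0.0              # reset neurons that fired
        spikes.append(fired.astype(int))
    return np.array(spikes)

# Example: a bright spot sweeping across the pixel row generates events
# at every step, while a static frame sequence would generate none.
T = 10
frames = np.zeros((T, n_pixels))
for t in range(T):
    frames[t, t % n_pixels] = 1.0
print(in_sensor_snn(frames, W))
```

In this sketch, reprogramming `W` corresponds to electrically setting each pixel's photoresponsivity, which is how the paper moves the synaptic-weight storage into the sensor itself and removes the sensor-to-processor data transfer.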
- Published
2023