
Neuromorphic Engineering Needs Closed-Loop Benchmarks.

Authors:
Milde, Moritz B.
Afshar, Saeed
Xu, Ying
Marcireau, Alexandre
Joubert, Damien
Ramesh, Bharath
Bethi, Yeshwanth
Ralph, Nicholas O.
El Arja, Sami
Dennler, Nik
van Schaik, André
Cohen, Gregory
Source:
Frontiers in Neuroscience; 2/14/2022, Vol. 16, p1-16, 16p
Publication Year:
2022

Abstract

Neuromorphic engineering aims to build (autonomous) systems by mimicking biological systems. It is motivated by the observation that biological organisms, from algae to primates, excel at sensing their environment and reacting promptly to perils and opportunities. Furthermore, they do so more resiliently than our most advanced machines, at a fraction of the power consumption. It follows that the performance of neuromorphic systems should be evaluated in terms of real-time operation, power consumption, and resiliency to real-world perturbations and noise, using task-relevant evaluation metrics. Yet, following in the footsteps of conventional machine learning, most neuromorphic benchmarks rely on recorded datasets that promote sensing accuracy as the primary measure of performance. Sensing accuracy, however, is merely a proxy for the system's actual goal: making a good decision in a timely manner. Moreover, static datasets hinder our ability to study and compare the closed-loop sensing and control strategies that are central to survival for biological organisms. This article makes the case for a renewed focus on closed-loop benchmarks involving real-world tasks. Such benchmarks will be crucial to developing and advancing neuromorphic intelligence. The shift towards dynamic, real-world benchmarking tasks should usher in richer, more resilient, and more robust artificially intelligent systems in the future. [ABSTRACT FROM AUTHOR]
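To make the abstract's central proposal concrete, the sketch below illustrates what a closed-loop benchmark harness might look like, as opposed to scoring accuracy on a recorded dataset. It is not from the paper; the environment, controller, scoring weights, and energy model are all hypothetical placeholders. The point it demonstrates is that the system under test is evaluated while interacting with a live task, on task-relevant quantities such as tracking error, decision latency, and an energy proxy.

```python
import time

class ToyTrackingEnv:
    """Hypothetical stand-in for a real-world task: keep an actuator
    aligned with a drifting target. Illustrative only; a real benchmark
    would add sensor noise and perturbations to probe resiliency."""

    def __init__(self, steps=1000):
        self.steps = steps
        self.target = 0.0
        self.position = 0.0
        self.t = 0

    def observe(self):
        # The target drifts back and forth; the agent sees only the
        # current tracking error, not the full trajectory.
        self.target += 0.01 if (self.t // 100) % 2 == 0 else -0.01
        return self.target - self.position

    def act(self, command):
        # The agent's decision changes the world it will sense next:
        # this feedback is what static datasets cannot capture.
        self.position += command
        self.t += 1
        return self.t < self.steps


def closed_loop_score(controller, env, energy_per_update=1e-6):
    """Score the controller on task error, decision latency, and a
    crude energy proxy. The weighting and energy model are arbitrary
    placeholders, not a metric proposed by the authors."""
    total_error, total_latency, updates = 0.0, 0.0, 0
    running = True
    while running:
        error = env.observe()
        start = time.perf_counter()
        command = controller(error)  # decision made under time pressure
        total_latency += time.perf_counter() - start
        running = env.act(command)
        total_error += abs(error)
        updates += 1
    return {
        "mean_error": total_error / updates,
        "mean_latency_s": total_latency / updates,
        "energy_j": updates * energy_per_update,
    }


if __name__ == "__main__":
    # A trivial proportional controller as the system under test.
    print(closed_loop_score(lambda err: 0.5 * err, ToyTrackingEnv()))
```

Because the environment reacts to the controller's actions, two systems with identical open-loop sensing accuracy can earn very different scores here, which is precisely the distinction the article argues benchmarks should expose.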

Details

Language:
English
ISSN:
1662-4548
Volume:
16
Database:
Complementary Index
Journal:
Frontiers in Neuroscience
Publication Type:
Academic Journal
Accession Number:
155257318
Full Text:
https://doi.org/10.3389/fnins.2022.813555