
Invariant Feature Extraction From Event Based Stimuli

Authors :
Chandrapala, Thusitha N.
Shi, Bertram E
Publication Year :
2016

Abstract

We propose a novel architecture, the event-based Generative Adaptive Subspace Self-Organizing Map (GASSOM), for learning and extracting invariant representations from event streams originating from neuromorphic vision sensors. The framework is inspired by feed-forward cortical models of visual processing. The model, which is based on the concepts of sparsity and temporal slowness, learns feature extractors that resemble neurons in the primary visual cortex. Layers of units in the proposed model can be cascaded to learn feature extractors with different levels of complexity and selectivity. We explore the applicability of the framework to real-world tasks by using the learned network for object recognition. The proposed model achieves higher classification accuracy than other state-of-the-art event-based processing methods. Our results also demonstrate the generality and robustness of the method, as the recognizers for different data sets and different tasks all used the same set of learned feature detectors, which were trained on data collected independently of the testing data. © 2016 IEEE.
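The adaptive-subspace SOM family that GASSOM builds on assigns each input to the node whose learned subspace best reconstructs it. As a rough illustration only (not the authors' implementation, and all names here are hypothetical), a minimal sketch of subspace winner selection by projection energy might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_node(dim, sub_dim, rng):
    # Hypothetical node: an orthonormal basis spanning a learned subspace.
    q, _ = np.linalg.qr(rng.standard_normal((dim, sub_dim)))
    return q  # shape (dim, sub_dim), orthonormal columns

def projection_energy(basis, x):
    # Squared norm of x projected onto the node's subspace.
    coeffs = basis.T @ x
    return float(coeffs @ coeffs)

def winner(nodes, x):
    # Winner-take-all: the node whose subspace captures the most input energy.
    energies = [projection_energy(b, x) for b in nodes]
    return int(np.argmax(energies))

dim, sub_dim, n_nodes = 16, 4, 8
nodes = [make_node(dim, sub_dim, rng) for _ in range(n_nodes)]
x = rng.standard_normal(dim)
w = winner(nodes, x)
```

In the paper's setting the input would be an event-stream patch and the winning subspace would additionally be encouraged to change slowly over time (temporal slowness); that learning step is omitted here.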

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1007129047
Document Type :
Electronic Resource