163 results for "spike timing dependent plasticity"
Search Results
2. Consciousness driven Spike Timing Dependent Plasticity
- Author
-
Yadav, Sushant, Chaudhary, Santosh, Kumar, Rajesh, and Nkomozepi, Pilani
- Published
- 2025
- Full Text
- View/download PDF
3. Reinforcement Learning Control of Cart Pole System with Spike Timing Neural Network Actor-Critic Architecture
- Author
-
Markov, Borislav, Koprinkova-Hristova, Petia, Koprinkova-Hristova, Petia, editor, and Kasabov, Nikola, editor
- Published
- 2025
- Full Text
- View/download PDF
4. Quasi Biologically Plausible Category Learning
- Author
-
Huyck, Christian, Bramer, Max, editor, and Stahl, Frederic, editor
- Published
- 2025
- Full Text
- View/download PDF
5. A spiking binary neuron — detector of causal links
- Author
-
Kiselev, Mikhail V., Larionov, Denis Aleksandrovich, and Urusov, Andrey M.
- Subjects
spiking neural network, binary neuron, spike timing dependent plasticity, dopamine-modulated plasticity, anti-hebbian plasticity, reinforcement learning, neuromorphic hardware, Physics, QC1-999 - Abstract
Purpose. Causal relationship recognition is a fundamental operation in neural networks aimed at learning behavior, action planning, and inferring external world dynamics. This operation is particularly crucial for reinforcement learning (RL). In the context of spiking neural networks (SNNs), events are represented as spikes emitted by network neurons or input nodes. Detecting causal relationships within these events is essential for effective RL implementation. Methods. This research paper presents a novel approach to realize causal relationship recognition using a simple spiking binary neuron. The proposed method leverages specially designed synaptic plasticity rules, which are both straightforward and efficient. Notably, our approach accounts for the temporal aspects of detected causal links and accommodates the representation of spiking signals as single spikes or tight spike sequences (bursts), as observed in biological brains. Furthermore, this study places a strong emphasis on the hardware-friendliness of the proposed models, ensuring their efficient implementation on modern and future neuroprocessors. Results. Compared with precise machine learning techniques, such as decision tree algorithms and convolutional neural networks, our neuron demonstrates satisfactory accuracy despite its simplicity. Conclusion. We introduce a multi-neuron structure capable of operating in more complex environments with enhanced accuracy, making it a promising candidate for the advancement of RL applications in SNNs. (A minimal illustrative sketch of a reward-modulated STDP update follows this entry.)
- Published
- 2024
- Full Text
- View/download PDF
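The entry above describes causal-link detection with dopamine-modulated, STDP-like plasticity. The sketch below is a generic reward-modulated pair-based STDP update with an eligibility trace, included only to make that idea concrete; the function names, constants, and exponential kernel are illustrative assumptions, not the plasticity rule proposed by Kiselev et al.

```python
import numpy as np

# Generic reward-modulated pair-based STDP with an eligibility trace (illustrative only;
# not the specific rule proposed in the paper above). Timing-dependent "credit" is
# accumulated per spike pair and converted into a weight change when a reward arrives.

TAU_PLUS, TAU_MINUS = 20.0, 20.0   # ms, decay constants of the STDP window (assumed)
A_PLUS, A_MINUS = 0.010, 0.012     # potentiation / depression amplitudes (assumed)
TAU_ELIG = 200.0                   # ms, eligibility-trace decay (assumed)

def stdp_window(dt):
    """Pair-based STDP kernel; dt = t_post - t_pre in ms."""
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)     # pre before post: potentiate
    return -A_MINUS * np.exp(dt / TAU_MINUS)       # post before (or with) pre: depress

def reward_modulated_update(w, pre_times, post_times, reward_time, reward, lr=1.0):
    """Accumulate eligibility over all pre/post pairs, then gate it by the reward signal."""
    eligibility = 0.0
    for t_pre in pre_times:
        for t_post in post_times:
            decay = np.exp(-(reward_time - max(t_pre, t_post)) / TAU_ELIG)
            eligibility += stdp_window(t_post - t_pre) * decay
    return float(np.clip(w + lr * reward * eligibility, 0.0, 1.0))

# A causal pairing (pre at 10 ms, post at 15 ms) rewarded at 100 ms strengthens the synapse.
w_new = reward_modulated_update(0.5, pre_times=[10.0], post_times=[15.0],
                                reward_time=100.0, reward=1.0)
```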
6. Real-time execution of SNN models with synaptic plasticity for handwritten digit recognition on SIMD hardware.
- Author
-
Vallejo-Mancero, Bernardo, Madrenas, Jordi, and Zapata, Mireya
- Subjects
ARTIFICIAL neural networks, PROCESS capability, DATABASES, PARALLEL processing, NEUROPLASTICITY - Abstract
Recent advancements in neuromorphic computing have led to the development of hardware architectures inspired by Spiking Neural Networks (SNNs) to emulate the efficiency and parallel processing capabilities of the human brain. This work focuses on testing the HEENS architecture, specifically designed for high parallel processing and biological realism in SNN emulation, implemented on a ZYNQ family FPGA. The study applies this architecture to the classification of digits using the well-known MNIST database. The image resolutions were adjusted to match HEENS' processing capacity. Results were compared with existing work, demonstrating that HEENS' performance is comparable to other solutions. This study highlights the importance of balancing accuracy and efficiency in the execution of applications. HEENS offers a flexible solution for SNN emulation, allowing for the implementation of programmable neural and synaptic models. It encourages the exploration of novel algorithms and network architectures, providing an alternative for real-time processing with efficient energy consumption.
- Published
- 2024
- Full Text
- View/download PDF
7. Real-time execution of SNN models with synaptic plasticity for handwritten digit recognition on SIMD hardware
- Author
-
Bernardo Vallejo-Mancero, Jordi Madrenas, and Mireya Zapata
- Subjects
HEENS, neuromorphic hardware, spiking neural network, LIF model, Spike Timing Dependent Plasticity, MNIST dataset, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
Recent advancements in neuromorphic computing have led to the development of hardware architectures inspired by Spiking Neural Networks (SNNs) to emulate the efficiency and parallel processing capabilities of the human brain. This work focuses on testing the HEENS architecture, specifically designed for high parallel processing and biological realism in SNN emulation, implemented on a ZYNQ family FPGA. The study applies this architecture to the classification of digits using the well-known MNIST database. The image resolutions were adjusted to match HEENS' processing capacity. Results were compared with existing work, demonstrating that HEENS' performance is comparable to other solutions. This study highlights the importance of balancing accuracy and efficiency in the execution of applications. HEENS offers a flexible solution for SNN emulation, allowing for the implementation of programmable neural and synaptic models. It encourages the exploration of novel algorithms and network architectures, providing an alternative for real-time processing with efficient energy consumption. (A minimal discrete-time LIF neuron sketch follows this entry.)
- Published
- 2024
- Full Text
- View/download PDF
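Both records of this paper list the leaky integrate-and-fire (LIF) model and STDP among the components emulated. As background, here is a minimal discrete-time LIF neuron of the general kind emulated on digital SNN hardware; the time constant, threshold, and input values are illustrative assumptions, not the HEENS configuration.

```python
import numpy as np

# Minimal forward-Euler leaky integrate-and-fire (LIF) neuron. All parameter values are
# illustrative assumptions, not the configuration used on the HEENS hardware.

def simulate_lif(input_current, dt=1.0, tau_m=20.0, v_rest=0.0,
                 v_reset=0.0, v_thresh=1.0, r_m=1.0):
    """Return the membrane-potential trace and spike times for a per-step input current."""
    v = v_rest
    v_trace, spike_times = [], []
    for step, i_in in enumerate(input_current):
        # leaky integration: dv/dt = (-(v - v_rest) + R_m * I) / tau_m
        v += dt * (-(v - v_rest) + r_m * i_in) / tau_m
        if v >= v_thresh:              # threshold crossing: emit a spike and reset
            spike_times.append(step * dt)
            v = v_reset
        v_trace.append(v)
    return np.array(v_trace), spike_times

# A constant supra-threshold input produces regular spiking.
_, spikes = simulate_lif(np.full(200, 1.5))
```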
8. Theoretical investigations into principles of topographic map formation and applications
- Author
-
Gale, Nicholas, Eglen, Stephen, and Franze, Kristian
- Subjects
Chemotaxis, Data science, Dynamical Systems, EphA3, GPU acceleration, Neural Development, Retinotopy, Spike timing dependent plasticity, Superior Colliculus, Travelling Salesman Problem - Abstract
Topographic maps are ubiquitous brain structures that are fundamental to sensory and higher order systems and are composed of connections between two regions obeying the relationship: physically neighbouring cells in a pre-synaptic region connect to physically neighbouring cells in the post-synaptic region. The developmental principles driving topographic map formation are usually studied within the context of genetic perturbations coupled to high resolution measurements and for these the mouse retinotopic map from retina to superior colliculus has emerged as a useful experimental context. Modelling coupled with genetic perturbation experiments has revealed three key developmental mechanisms: neural activity, chemotaxis, and competition. Some principal challenges in modelling this development include explaining the role of the spatio-temporal structure of patterned neural activity, determining the relative interaction between developmental components, and developing models that are sufficiently computationally efficient that statistical methodologies can be applied to them. Neural activity is a well measured component of retinotopic development and several independent measurement techniques have recorded the existence of spatiotemporally patterned waves at key critical points during development. Existing modelling methodologies reduce this rich spatiotemporal context into a distance dependent correlation function and have subsequently had challenges making quantitative predictions about the effect of manipulating these activity patterns. A neural field theory approach is used to develop mathematical theory which can incorporate these spatiotemporal structures. Bayesian MCMC regression analysis is performed on biological measurements to assess the accuracy of the model and make predictions about the time-scale on which activity operates. This time scale is tuned to the length of an average wave pattern suggesting the system is integrating all information in these waves. The interaction between chemotaxis and neural activity has historically been thought of as linearly independent. A recent study which perturbs both developmental mechanisms simultaneously has suggested that these two are highly stochastic and regular development depends on a critical fine-tuned balance between the two: the heterozygous phenotype was observed to present as both a wild-type and homozygote for different specimens. This hypothesis is tested against the data-set used to generate it. Recreating the entire experimental pipeline in silico with the most parsimonious existing model is able to account for the data without the need to appeal to stochasticity in the mechanisms. A statistical analysis demonstrates that the heterozygous state does not significantly overlap with the homozygotes and that the stochasticity is likely due to the measurement technique. The existing models are computationally demanding; at least O(n^3) in the number of retinal cells instantiated by the model. This computational demand renders these classes of models incapable of performing statistical regression and means that their parameter spaces are largely unexplored. A modelling framework which integrates the core operating mechanisms of the model is developed and when implemented on modern GPU computational architectures is able to achieve a near-linear time complexity scaling. This model is demonstrated to capture the explanatory power of existing modelling methodologies.
Finally, the role of competition is explored in a dimensional reduction framework: the Elastic Net. The Elastic Net has been used both as a heuristic optimiser (validated on the NP-complete Travelling Salesman Problem) and to explain the development of cortical feature maps. The addition of competition is demonstrated to act as a counter-measure to the retinotopic distorting components of the Elastic Net as a cortical map generator. Further analysis demonstrates that competition substantially improves heuristic performance on the Travelling Salesman Problem, making it competitive against state of the art solvers when performance is normalised by solution times. The heuristic converges on a length scaling law that is discussed in the context of the wire-minimisation problem.
- Published
- 2022
- Full Text
- View/download PDF
9. A Spiking Neuron Synaptic Plasticity Model Optimized for Unsupervised Learning
- Author
-
Kiselev, Mikhail, Ivanitsky, Alexander, Ivanov, Dmitry, Larionov, Denis, Kryzhanovsky, Boris, editor, Dunin-Barkowski, Witali, editor, Redko, Vladimir, editor, Tiumentsev, Yury, editor, and Klimov, Valentin, editor
- Published
- 2023
- Full Text
- View/download PDF
10. Toward Learning in Neuromorphic Circuits Based on Quantum Phase Slip Junctions
- Author
-
Cheng, Ran, Goteti, Uday S., Walker, Harrison, Krause, Keith M., Oeding, Luke, and Hamilton, Michael C.
- Subjects
Biological Psychology, Biomedical and Clinical Sciences, Neurosciences, Psychology, Affordable and Clean Energy, quantum phase slip junction, Josephson junction, neuromorphic computing, spike timing dependent plasticity, unsupervised learning, coupled synapse networks, Cognitive Sciences, Biological psychology - Abstract
We explore the use of superconducting quantum phase slip junctions (QPSJs), an electromagnetic dual to Josephson Junctions (JJs), in neuromorphic circuits. These small circuits could serve as the building blocks of neuromorphic circuits for machine learning applications because they exhibit desirable properties such as inherent ultra-low energy per operation, high speed, dense integration, negligible loss, and natural spiking responses. In addition, they have a relatively straight-forward micro/nano fabrication, which shows promise for implementation of an enormous number of lossless interconnections that are required to realize complex neuromorphic systems. We simulate QPSJ-only, as well as hybrid QPSJ + JJ circuits for application in neuromorphic circuits including artificial synapses and neurons, as well as fan-in and fan-out circuits. We also design and simulate learning circuits, where a simplified spike timing dependent plasticity rule is realized to provide potential learning mechanisms. We also take an alternative approach, which shows potential to overcome some of the expected challenges of QPSJ-based neuromorphic circuits, via QPSJ-based charge islands coupled together to generate non-linear charge dynamics that result in a large number of programmable weights or non-volatile memory states. Notably, we show that these weights are a function of the timing and frequency of the input spiking signals and can be programmed using a small number of DC voltage bias signals, therefore exhibiting spike-timing and rate dependent plasticity, which are mechanisms to realize learning in neuromorphic circuits.
- Published
- 2021
11. TiN/Ti/HfO2/TiN memristive devices for neuromorphic computing: from synaptic plasticity to stochastic resonance.
- Author
-
Maldonado, David, Cantudo, Antonio, Perez, Eduardo, Romero-Zaliz, Rocio, Quesada, Emilio Perez-Bosch, Mahadevaiah, Mamathamba Kalishettyhalli, Jimenez-Molinos, Francisco, Wenger, Christian, and Roldan, Juan Bautista
- Subjects
STOCHASTIC resonance, NEUROPLASTICITY, TITANIUM nitride, DEPENDENCY (Psychology) - Abstract
We characterize TiN/Ti/HfO2/TiN memristive devices for neuromorphic computing. We analyze different features that allow the devices to mimic biological synapses and present the models to reproduce analytically some of the data measured. In particular, we have measured the spike timing dependent plasticity behavior in our devices and later on we have modeled it. The spike timing dependent plasticity model was implemented as the learning rule of a spiking neural network that was trained to recognize the MNIST dataset. Variability is implemented and its influence on the network recognition accuracy is considered accounting for the number of neurons in the network and the number of training epochs. Finally, stochastic resonance is studied as another synaptic feature. It is shown that this effect is important and greatly depends on the noise statistical characteristics.
- Published
- 2023
- Full Text
- View/download PDF
12. Competitive Learning with Spiking Nets and Spike Timing Dependent Plasticity
- Author
-
Huyck, Christian, Erekpaine, Orume, Bramer, Max, editor, and Stahl, Frederic, editor
- Published
- 2022
- Full Text
- View/download PDF
13. TiN/Ti/HfO2/TiN memristive devices for neuromorphic computing: from synaptic plasticity to stochastic resonance
- Author
-
David Maldonado, Antonio Cantudo, Eduardo Perez, Rocio Romero-Zaliz, Emilio Perez-Bosch Quesada, Mamathamba Kalishettyhalli Mahadevaiah, Francisco Jimenez-Molinos, Christian Wenger, and Juan Bautista Roldan
- Subjects
resistive switching devices, neuromorphic computing, synaptic behavior, spike timing dependent plasticity, stochastic resonance, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
We characterize TiN/Ti/HfO2/TiN memristive devices for neuromorphic computing. We analyze different features that allow the devices to mimic biological synapses and present the models to reproduce analytically some of the data measured. In particular, we have measured the spike timing dependent plasticity behavior in our devices and later on we have modeled it. The spike timing dependent plasticity model was implemented as the learning rule of a spiking neural network that was trained to recognize the MNIST dataset. Variability is implemented and its influence on the network recognition accuracy is considered accounting for the number of neurons in the network and the number of training epochs. Finally, stochastic resonance is studied as another synaptic feature. It is shown that this effect is important and greatly depends on the noise statistical characteristics. (An illustrative double-exponential STDP window sketch follows this entry.)
- Published
- 2023
- Full Text
- View/download PDF
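The abstract states that the measured STDP characteristic was modeled analytically and then used as the learning rule of an MNIST-trained SNN. A common analytical form for such device data is a double-exponential window with soft (multiplicative) bounds on the conductance; the sketch below uses that generic form with assumed constants, not the parameters extracted by Maldonado et al.

```python
import numpy as np

# Double-exponential STDP window with soft (multiplicative) conductance bounds, a common
# analytical form for fitted device data. All constants are assumptions for illustration.

A_PLUS, A_MINUS = 0.8, 0.4          # dimensionless amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 15.0, 30.0    # ms, window time constants (assumed)
G_MIN, G_MAX = 1e-6, 1e-4           # conductance bounds in siemens (assumed)

def delta_g(g, dt):
    """Conductance change for one pre/post pair; dt = t_post - t_pre in ms."""
    if dt > 0:   # pre before post: potentiation, scaled by the remaining headroom
        return A_PLUS * np.exp(-dt / TAU_PLUS) * (G_MAX - g)
    # post before pre: depression, scaled by the distance to the floor
    return -A_MINUS * np.exp(dt / TAU_MINUS) * (g - G_MIN)

g = 5e-5
for dt in (5.0, 20.0, -5.0, -20.0):
    print(f"dt = {dt:+.0f} ms -> delta_g = {delta_g(g, dt):+.2e} S")
```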
14. Supervised learning of spatial features with STDP and homeostasis using Spiking Neural Networks on SpiNNaker.
- Author
-
Davies, Sergio, Gait, Andrew, Rowley, Andrew, and Di Nuovo, Alessandro
- Subjects
ARTIFICIAL neural networks, PATTERN recognition systems, SUPERVISED learning, NETWORK performance, IMAGE analysis - Abstract
Artificial Neural Networks (ANN) have gained significant popularity thanks to their ability to learn using the well-known backpropagation algorithm. Conversely, Spiking Neural Networks (SNNs), despite having broader capabilities than ANNs, have always posed challenges in the training phase. This paper shows a new method to perform supervised learning on SNNs, using Spike Timing Dependent Plasticity (STDP) and homeostasis, aiming at training the network to identify spatial patterns. Spatial patterns refer to spike patterns without a time component, where all spike events occur simultaneously. The method is tested using the SpiNNaker digital architecture. An SNN is trained to recognise one or multiple patterns and performance metrics are extracted to measure the performance of the network. Some considerations are drawn from the results showing that, in the case of a single trained pattern, the network behaves as the ideal detector, with 100% accuracy in detecting the trained pattern. However, as the number of trained patterns on a single network increases, the accuracy of identification is linked to the similarities between these patterns. This method of training an SNN to detect spatial patterns may be applied to pattern recognition in static images or traffic analysis in computer networks, where each network packet represents a spatial pattern. It will be stipulated that the homeostatic factor may enable the network to detect patterns with some degree of similarity, rather than only perfectly matching patterns. The principles outlined in this article serve as the fundamental building blocks for more complex systems that utilise both spatial and temporal patterns by converting specific features of input signals into spikes. One example of such a system is a computer network packet classifier, tasked with real-time identification of packet streams based on features within the packet content. (A generic firing-rate homeostasis sketch follows this entry.)
- Published
- 2025
- Full Text
- View/download PDF
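The paper combines STDP with homeostasis so that trained neurons neither saturate nor fall silent. The snippet below is a generic firing-rate homeostasis rule (an adaptive threshold driven toward a target rate); it is a sketch of the general mechanism only, not the SpiNNaker implementation described by Davies et al., and the target rate and gains are assumed values.

```python
# Generic firing-rate homeostasis: each neuron adapts its threshold offset toward a target
# output rate. The target rate, gain, and relaxation constant are illustrative assumptions.

def homeostatic_threshold(theta, fired, dt=1.0, target_rate=0.02, gain=0.05, tau_relax=1e4):
    """One update step. `fired` is 1 if the neuron spiked during this step, else 0."""
    theta += gain * (fired - target_rate * dt)   # firing above target raises the threshold
    theta -= theta * dt / tau_relax              # slow relaxation toward the baseline
    return theta

theta = 0.0
for step in range(1000):
    fired = 1 if step % 25 == 0 else 0           # toy spike train at 0.04 spikes per step
    theta = homeostatic_threshold(theta, fired)
# theta ends up positive, pushing this over-active neuron's effective threshold upward.
```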
15. Continual learning with hebbian plasticity in sparse and predictive coding networks: a survey and perspective
- Author
-
Ali Safa
- Subjects
spiking neural network, SNN, spike timing dependent plasticity, STDP, Hebbian, continual learning, Electronic computers. Computer science, QA75.5-76.95 - Abstract
Recently, the use of bio-inspired learning techniques such as Hebbian learning and its closely-related spike-timing-dependent plasticity (STDP) variant have drawn significant attention for the design of compute-efficient AI systems that can continuously learn on-line at the edge. A key differentiating factor regarding this emerging class of neuromorphic continual learning system lies in the fact that learning must be carried out using a data stream received in its natural order, as opposed to conventional gradient-based offline training, where a static training dataset is assumed available a priori and randomly shuffled to make the training set independent and identically distributed (i.i.d). In contrast, the emerging class of neuromorphic CL systems covered in this survey must learn to integrate new information on the fly in a non-i.i.d manner, which makes these systems subject to catastrophic forgetting. In order to build the next generation of neuromorphic AI systems that can continuously learn at the edge, a growing number of research groups are studying the use of sparse and predictive coding (PC)-based Hebbian neural network architectures and the related spiking neural networks (SNNs) equipped with STDP learning. However, since this research field is still emerging, there is a need for providing a holistic view of the different approaches proposed in the literature so far. To this end, this survey covers a number of recent works in the field of neuromorphic CL based on state-of-the-art sparse and PC technology; provides background theory to help interested researchers quickly learn the key concepts; and discusses important future research questions in light of the different works covered in this paper. It is hoped that this survey will contribute towards future research in the field of neuromorphic CL.
- Published
- 2024
- Full Text
- View/download PDF
16. Heterogeneous recurrent spiking neural network for spatio-temporal classification.
- Author
-
Chakraborty, Biswadeep and Mukhopadhyay, Saibal
- Subjects
ARTIFICIAL neural networks, RECURRENT neural networks, ARTIFICIAL intelligence - Abstract
Spiking Neural Networks are often touted as brain-inspired learning models for the third wave of Artificial Intelligence. Although recent SNNs trained with supervised backpropagation show classification accuracy comparable to deep networks, the performance of unsupervised learning-based SNNs remains much lower. This paper presents a heterogeneous recurrent spiking neural network (HRSNN) with unsupervised learning for spatio-temporal classification of video activity recognition tasks on RGB (KTH, UCF11, UCF101) and event-based datasets (DVS128 Gesture). We observed an accuracy of 94.32% for the KTH dataset, 79.58% and 77.53% for the UCF11 and UCF101 datasets, respectively, and an accuracy of 96.54% on the event-based DVS Gesture dataset using the novel unsupervised HRSNN model. The key novelty of the HRSNN is that the recurrent layer in HRSNN consists of heterogeneous neurons with varying firing/relaxation dynamics, and they are trained via heterogeneous spike-time-dependent-plasticity (STDP) with varying learning dynamics for each synapse. We show that this novel combination of heterogeneity in architecture and learning method outperforms current homogeneous spiking neural networks. We further show that HRSNN can achieve similar performance to state-of-the-art backpropagation trained supervised SNN, but with less computation (fewer neurons and sparse connection) and less training data.
- Published
- 2023
- Full Text
- View/download PDF
17. Extended Category Learning with Spiking Nets and Spike Timing Dependent Plasticity
- Author
-
Huyck, Christian, Samey, Carlos, Bramer, Max, editor, and Ellis, Richard, editor
- Published
- 2021
- Full Text
- View/download PDF
18. Heterogeneous recurrent spiking neural network for spatio-temporal classification
- Author
-
Biswadeep Chakraborty and Saibal Mukhopadhyay
- Subjects
spiking neural network (SNN), action detection and recognition, spike timing dependent plasticity, heterogeneity, unsupervised learning, Bayesian Optimization (BO), Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
Spiking Neural Networks are often touted as brain-inspired learning models for the third wave of Artificial Intelligence. Although recent SNNs trained with supervised backpropagation show classification accuracy comparable to deep networks, the performance of unsupervised learning-based SNNs remains much lower. This paper presents a heterogeneous recurrent spiking neural network (HRSNN) with unsupervised learning for spatio-temporal classification of video activity recognition tasks on RGB (KTH, UCF11, UCF101) and event-based datasets (DVS128 Gesture). We observed an accuracy of 94.32% for the KTH dataset, 79.58% and 77.53% for the UCF11 and UCF101 datasets, respectively, and an accuracy of 96.54% on the event-based DVS Gesture dataset using the novel unsupervised HRSNN model. The key novelty of the HRSNN is that the recurrent layer in HRSNN consists of heterogeneous neurons with varying firing/relaxation dynamics, and they are trained via heterogeneous spike-time-dependent-plasticity (STDP) with varying learning dynamics for each synapse. We show that this novel combination of heterogeneity in architecture and learning method outperforms current homogeneous spiking neural networks. We further show that HRSNN can achieve similar performance to state-of-the-art backpropagation trained supervised SNN, but with less computation (fewer neurons and sparse connection) and less training data.
- Published
- 2023
- Full Text
- View/download PDF
19. Unsupervised heart-rate estimation in wearables with Liquid states and a probabilistic readout.
- Author
-
Das, Anup, Pradhapan, Paruthi, Groenendaal, Willemijn, Adiraju, Prathyusha, Rajan, Raj Thilak, Catthoor, Francky, Schaafsma, Siebren, Krichmar, Jeffrey L, Dutt, Nikil, and Van Hoof, Chris
- Subjects
Neurons, Humans, Electrocardiography, Probability, Action Potentials, Heart Rate, Neuronal Plasticity, Algorithms, Unsupervised Machine Learning, Wearable Electronic Devices, Electrocardiogram, Fuzzy c-Means clustering, Homeostatic plasticity, Liquid state machine, Spike timing dependent plasticity, Spiking neural networks, cs.NE, cs.LG, Artificial Intelligence & Image Processing - Abstract
Heart-rate estimation is a fundamental feature of modern wearable devices. In this paper we propose a machine learning technique to estimate heart-rate from electrocardiogram (ECG) data collected using wearable devices. The novelty of our approach lies in (1) encoding spatio-temporal properties of ECG signals directly into spike train and using this to excite recurrently connected spiking neurons in a Liquid State Machine computation model; (2) a novel learning algorithm; and (3) an intelligently designed unsupervised readout based on Fuzzy c-Means clustering of spike responses from a subset of neurons (Liquid states), selected using particle swarm optimization. Our approach differs from existing works by learning directly from ECG signals (allowing personalization), without requiring costly data annotations. Additionally, our approach can be easily implemented on state-of-the-art spiking-based neuromorphic systems, offering high accuracy, yet significantly low energy footprint, leading to an extended battery-life of wearable devices. We validated our approach with CARLsim, a GPU accelerated spiking neural network simulator modeling Izhikevich spiking neurons with Spike Timing Dependent Plasticity (STDP) and homeostatic scaling. A range of subjects is considered from in-house clinical trials and public ECG databases. Results show high accuracy and low energy footprint in heart-rate estimation across subjects with and without cardiac irregularities, signifying the strong potential of this approach to be integrated in future wearable devices.
- Published
- 2018
20. STDP Plasticity in TRN Within Hierarchical Spike Timing Model of Visual Information Processing
- Author
-
Koprinkova-Hristova, Petia, Bocheva, Nadejda, Nedelcheva, Simona, Stefanova, Miroslava, Genova, Bilyana, Kraleva, Radoslava, Kralev, Velin, Maglogiannis, Ilias, editor, Iliadis, Lazaros, editor, and Pimenidis, Elias, editor
- Published
- 2020
- Full Text
- View/download PDF
21. Learning Precise Spike Timings with Eligibility Traces
- Author
-
Traub, Manuel, Butz, Martin V., Baayen, R. Harald, Otte, Sebastian, Farkaš, Igor, editor, Masulli, Paolo, editor, and Wermter, Stefan, editor
- Published
- 2020
- Full Text
- View/download PDF
22. Real-time execution of SNN models with synaptic plasticity for handwritten digit recognition on SIMD hardware
- Author
-
Universitat Politècnica de Catalunya. Departament d'Enginyeria Electrònica, Universitat Politècnica de Catalunya. IS2- Sensors Intel·ligents i Sistemes Integrats, Vallejo Mancero, Bernardo Javier, Madrenas Boadas, Jordi, and Zapata, Mireya
- Abstract
Recent advancements in neuromorphic computing have led to the development of hardware architectures inspired by Spiking Neural Networks (SNNs) to emulate the efficiency and parallel processing capabilities of the human brain. This work focuses on testing the HEENS architecture, specifically designed for high parallel processing and biological realism in SNN emulation, implemented on a ZYNQ family FPGA. The study applies this architecture to the classification of digits using the well-known MNIST database. The image resolutions were adjusted to match HEENS' processing capacity. Results were compared with existing work, demonstrating that HEENS' performance is comparable to other solutions. This study highlights the importance of balancing accuracy and efficiency in the execution of applications. HEENS offers a flexible solution for SNN emulation, allowing for the implementation of programmable neural and synaptic models. It encourages the exploration of novel algorithms and network architectures, providing an alternative for real-time processing with efficient energy consumption.
- Published
- 2024
23. Multilayer Photonic Spiking Neural Networks: Generalized Supervised Learning Algorithm and Network Optimization.
- Author
-
Fu, Chentao, Xiang, Shuiying, Han, Yanan, Song, Ziwei, and Hao, Yue
- Subjects
MACHINE learning, SUPERVISED learning, SURFACE emitting lasers, MATHEMATICAL optimization, BREAST cancer, PROBLEM solving - Abstract
We propose a generalized supervised learning algorithm for multilayer photonic spiking neural networks (SNNs) by combining the spike-timing dependent plasticity (STDP) rule and the gradient descent mechanism. A vertical-cavity surface-emitting laser with an embedded saturable absorber (VCSEL-SA) is employed as a photonic leaky-integrate-and-fire (LIF) neuron. The temporal coding strategy is employed to transform information into the precise firing time. With the modified supervised learning algorithm, the trained multilayer photonic SNN successfully solves the XOR problem and performs well on the Iris and Wisconsin breast cancer datasets. This indicates that a generalized supervised learning algorithm is realized for multilayer photonic SNN. In addition, network optimization is performed by considering different network sizes. (A minimal time-to-first-spike encoding sketch follows this entry.)
- Published
- 2022
- Full Text
- View/download PDF
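The abstract notes that a temporal coding strategy transforms input information into precise firing times. A minimal latency (time-to-first-spike) encoder of that general kind is sketched below; the linear mapping and the 20 ms coding window are assumptions for illustration and are unrelated to the VCSEL-SA dynamics used in the paper.

```python
import numpy as np

# Minimal time-to-first-spike (latency) encoder: stronger inputs fire earlier. The linear
# mapping and the 20 ms coding window are illustrative assumptions, not the photonic scheme.

def latency_encode(x, t_max=20.0):
    """Map features in [0, 1] to first-spike times in milliseconds (1.0 fires at t = 0)."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    times = (1.0 - x) * t_max
    times[x <= 0.0] = np.inf          # zero input: the neuron stays silent
    return times

print(latency_encode([0.0, 0.25, 0.9, 1.0]))   # [inf 15. 2. 0.]
```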
24. Tutorial on Stochastic Computing
- Author
-
Winstead, Chris, Gross, Warren J., editor, and Gaudet, Vincent C., editor
- Published
- 2019
- Full Text
- View/download PDF
25. Model reduction for stochastic CaMKII reaction kinetics in synapses by graph-constrained correlation dynamics.
- Author
-
Johnson, Todd, Bartol, Tom, Sejnowski, Terrence, and Mjolsness, Eric
- Subjects
Synapses, Calcium, Calmodulin, Probability, Kinetics, Algorithms, Models, Neurological, Models, Chemical, Calcium-Calmodulin-Dependent Protein Kinase Type 2, Molecular Dynamics Simulation, model reduction, stochastic reaction networks, rule-based modeling, graph-constrained correlation dynamics, Boltzmann learning, CaMKII, spike timing dependent plasticity, Biophysics, Engineering, Physical Sciences, Biological Sciences - Abstract
A stochastic reaction network model of Ca(2+) dynamics in synapses (Pepke et al., PLoS Comput. Biol. 6, e1000675) is expressed and simulated using rule-based reaction modeling notation in dynamical grammars and in MCell. The model tracks the response of calmodulin and CaMKII to calcium influx in synapses. Data from numerically intensive simulations is used to train a reduced model that, out of sample, correctly predicts the evolution of interaction parameters characterizing the instantaneous probability distribution over molecular states in the much larger fine-scale models. The novel model reduction method, 'graph-constrained correlation dynamics', requires a graph of plausible state variables and interactions as input. It parametrically optimizes a set of constant coefficients appearing in differential equations governing the time-varying interaction parameters that determine all correlations between variables in the reduced model at any time slice.
- Published
- 2015
26. Spiking neural networks compensate for weight drift in organic neuromorphic device networks
- Author
-
Daniel Felder, John Linkhorst, and Matthias Wessling
- Subjects
neuromorphic computing, spiking neural network, spike timing dependent plasticity, organic electronics, algorithm-hardware co-design, Electronic computers. Computer science, QA75.5-76.95 - Abstract
Organic neuromorphic devices can accelerate neural networks and integrate with biological systems. Devices based on the biocompatible and conductive polymer PEDOT:PSS are fast, require low amounts of energy and perform well in crossbar simulations. However, parasitic electrochemical reactions lead to self-discharge and the fading of the learned conductance states over time. This limits a neural network’s operating time and requires complex compensation mechanisms. Spiking neural networks (SNNs) take inspiration from biology to implement local and always-on learning. We show that these SNNs can function on organic neuromorphic hardware and compensate for self-discharge by continuously relearning and reinforcing forgotten states. In this work, we use a high-resolution charge transport model to describe the behavior of organic neuromorphic devices and create a computationally efficient surrogate model. By integrating the surrogate model into a Brian 2 simulation, we can describe the behavior of SNNs on organic neuromorphic hardware. A biologically plausible two-layer network for recognizing 28×28 pixel MNIST images is trained and observed during self-discharge. The network achieves, for its size, competitive recognition results of up to 82.5%. Building a network with forgetful devices yields superior accuracy during training with 84.5% compared to ideal devices. However, trained networks without active spike-timing-dependent plasticity quickly lose their predictive performance. We show that online learning can keep the performance at a steady level close to the initial accuracy, even for idle rates of up to 90%. This performance is maintained when the output neuron’s labels are not revalidated for up to 24 h. These findings reconfirm the potential of organic neuromorphic devices for brain-inspired computing. Their biocompatibility and the demonstrated adaptability to SNNs open the path towards close integration with multi-electrode arrays, drug-delivery devices, and other bio-interfacing systems as either fully organic or hybrid organic-inorganic systems. (A standard Brian 2 STDP synapse sketch follows this entry.)
- Published
- 2023
- Full Text
- View/download PDF
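The abstract mentions that the device surrogate model was integrated into a Brian 2 simulation of an STDP-trained two-layer network. For readers unfamiliar with that workflow, the snippet below shows the standard trace-based STDP synapse from the Brian 2 documentation, here driving a single LIF neuron from Poisson inputs; it does not include the authors' organic-device surrogate model, and the rates and amplitudes are placeholder values.

```python
from brian2 import Hz, ms, NeuronGroup, PoissonGroup, Synapses, run

# Standard trace-based STDP synapse in Brian 2 (adapted from the library's documented
# example). The paper couples a rule of this kind to a surrogate model of the organic
# device; that surrogate is not included here, and all constants are placeholders.

taupre = taupost = 20*ms
wmax, Apre = 0.01, 0.0001
Apost = -Apre * taupre / taupost * 1.05          # slight LTD bias keeps weights bounded

inputs = PoissonGroup(100, rates=15*Hz)
neuron = NeuronGroup(1, 'dv/dt = -v/(10*ms) : 1',
                     threshold='v > 1', reset='v = 0', method='exact')

syn = Synapses(inputs, neuron,
               '''w : 1
                  dapre/dt  = -apre/taupre   : 1 (event-driven)
                  dapost/dt = -apost/taupost : 1 (event-driven)''',
               on_pre='''v_post += w
                         apre += Apre
                         w = clip(w + apost, 0, wmax)''',
               on_post='''apost += Apost
                          w = clip(w + apre, 0, wmax)''')
syn.connect()
syn.w = 'rand() * wmax'
run(100*ms)
```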
27. Toward Learning in Neuromorphic Circuits Based on Quantum Phase Slip Junctions
- Author
-
Ran Cheng, Uday S. Goteti, Harrison Walker, Keith M. Krause, Luke Oeding, and Michael C. Hamilton
- Subjects
quantum phase slip junction, Josephson junction, neuromorphic computing, spike timing dependent plasticity, unsupervised learning, coupled synapse networks, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
We explore the use of superconducting quantum phase slip junctions (QPSJs), an electromagnetic dual to Josephson Junctions (JJs), in neuromorphic circuits. These small circuits could serve as the building blocks of neuromorphic circuits for machine learning applications because they exhibit desirable properties such as inherent ultra-low energy per operation, high speed, dense integration, negligible loss, and natural spiking responses. In addition, they have a relatively straight-forward micro/nano fabrication, which shows promise for implementation of an enormous number of lossless interconnections that are required to realize complex neuromorphic systems. We simulate QPSJ-only, as well as hybrid QPSJ + JJ circuits for application in neuromorphic circuits including artificial synapses and neurons, as well as fan-in and fan-out circuits. We also design and simulate learning circuits, where a simplified spike timing dependent plasticity rule is realized to provide potential learning mechanisms. We also take an alternative approach, which shows potential to overcome some of the expected challenges of QPSJ-based neuromorphic circuits, via QPSJ-based charge islands coupled together to generate non-linear charge dynamics that result in a large number of programmable weights or non-volatile memory states. Notably, we show that these weights are a function of the timing and frequency of the input spiking signals and can be programmed using a small number of DC voltage bias signals, therefore exhibiting spike-timing and rate dependent plasticity, which are mechanisms to realize learning in neuromorphic circuits.
- Published
- 2021
- Full Text
- View/download PDF
28. Dopaminergic Neuromodulation of Spike Timing Dependent Plasticity in Mature Adult Rodent and Human Cortical Neurons
- Author
-
Emma Louise Louth, Rasmus Langelund Jørgensen, Anders Rosendal Korshoej, Jens Christian Hedemann Sørensen, and Marco Capogna
- Subjects
dopamine, human cortical slices, layer 5 pyramidal neurons, spike timing dependent plasticity, synaptic inhibition, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
Synapses in the cerebral cortex constantly change, and this dynamic property, regulated by the action of neuromodulators such as dopamine (DA), is essential for reward learning and memory. DA modulates spike-timing-dependent plasticity (STDP), a cellular model of learning and memory, in juvenile rodent cortical neurons. However, it is unknown whether this neuromodulation also occurs at excitatory synapses of cortical neurons in mature adult mice or in humans. Cortical layer V pyramidal neurons were recorded with whole cell patch clamp electrophysiology and an extracellular stimulating electrode was used to induce STDP. DA was either bath-applied or optogenetically released in slices from mice. Classical STDP induction protocols triggered non-hebbian excitatory synaptic depression in the mouse or no plasticity at human cortical synapses. DA reverted long term synaptic depression to baseline in mouse via dopamine 2 type receptors or elicited long term synaptic potentiation in human cortical synapses. Furthermore, when DA was applied during an STDP protocol it depressed presynaptic inhibition in the mouse but not in the human cortex. Thus, DA modulates excitatory synaptic plasticity differently in human vs. mouse cortex. The data strengthens the importance of DA in gating cognition in humans, and may inform on therapeutic interventions to recover brain function from diseases.
- Published
- 2021
- Full Text
- View/download PDF
29. Dopaminergic Neuromodulation of Spike Timing Dependent Plasticity in Mature Adult Rodent and Human Cortical Neurons.
- Author
-
Louth, Emma Louise, Jørgensen, Rasmus Langelund, Korshoej, Anders Rosendal, Sørensen, Jens Christian Hedemann, and Capogna, Marco
- Subjects
PATCH-clamp techniques (Electrophysiology), DOPAMINERGIC neurons, PYRAMIDAL neurons, RODENTS, REWARD (Psychology), NEURONS - Abstract
Synapses in the cerebral cortex constantly change, and this dynamic property, regulated by the action of neuromodulators such as dopamine (DA), is essential for reward learning and memory. DA modulates spike-timing-dependent plasticity (STDP), a cellular model of learning and memory, in juvenile rodent cortical neurons. However, it is unknown whether this neuromodulation also occurs at excitatory synapses of cortical neurons in mature adult mice or in humans. Cortical layer V pyramidal neurons were recorded with whole cell patch clamp electrophysiology and an extracellular stimulating electrode was used to induce STDP. DA was either bath-applied or optogenetically released in slices from mice. Classical STDP induction protocols triggered non-hebbian excitatory synaptic depression in the mouse or no plasticity at human cortical synapses. DA reverted long term synaptic depression to baseline in mouse via dopamine 2 type receptors or elicited long term synaptic potentiation in human cortical synapses. Furthermore, when DA was applied during an STDP protocol it depressed presynaptic inhibition in the mouse but not in the human cortex. Thus, DA modulates excitatory synaptic plasticity differently in human vs. mouse cortex. The data strengthens the importance of DA in gating cognition in humans, and may inform on therapeutic interventions to recover brain function from diseases.
- Published
- 2021
- Full Text
- View/download PDF
30. Memory stability and synaptic plasticity
- Author
-
Billings, Guy, van Rossum, Mark, and Morris, Richard
- Subjects
612.8, synaptic plasticity, learning, memory, Spike timing dependent plasticity - Abstract
Numerous experiments have demonstrated that the activity of neurons can alter the strength of excitatory synapses. This synaptic plasticity is bidirectional and synapses can be strengthened (potentiation) or weakened (depression). Synaptic plasticity offers a mechanism that links the ongoing activity of the brain with persistent physical changes to its structure. For this reason it is widely believed that synaptic plasticity mediates learning and memory. The hypothesis that synapses store memories by modifying their strengths raises an important issue. There should be a balance between the necessity that synapses change frequently, allowing new memories to be stored with high fidelity, and the necessity that synapses retain previously stored information. This is the plasticity stability dilemma. In this thesis the plasticity stability dilemma is studied in the context of the two dominant paradigms of activity dependent synaptic plasticity: Spike timing dependent plasticity (STDP) and long term potentiation and depression (LTP/D). Models of biological synapses are analysed and processes that might ameliorate the plasticity stability dilemma are identified. Two popular existing models of STDP are compared. Through this comparison it is demonstrated that the synaptic weight dynamics of STDP has a large impact upon the retention time of correlation between the weights of a single neuron and a memory. In networks it is shown that lateral inhibition stabilises the synaptic weights and receptive fields. To analyse LTP a novel model of LTP/D is proposed. The model centres on the distinction between early LTP/D, when synaptic modifications are persistent on a short timescale, and late LTP/D when synaptic modifications are persistent on a long timescale. In the context of the hippocampus it is proposed that early LTP/D allows the rapid and continuous storage of short lasting memory traces over a long lasting trace established with late LTP/D. It is shown that this might confer a longer memory retention time than in a system with only one phase of LTP/D. Experimental predictions about the dynamics of amnesia based upon this model are proposed. Synaptic tagging is a phenomenon whereby early LTP can be converted into late LTP, by subsequent induction of late LTP in a separate but nearby input. Synaptic tagging is incorporated into the LTP/D framework. Using this model it is demonstrated that synaptic tagging could lead to the conversion of a short lasting memory trace into a longer lasting trace. It is proposed that this allows the rescue of memory traces that were initially destined for complete decay. When combined with early and late LTP/D, synaptic tagging might allow the management of hippocampal memory traces, such that not all memories must be stored on the longest, most stable late phase timescale. This lessens the plasticity stability dilemma in the hippocampus, where it has been hypothesised that memory traces must be frequently and vividly formed, but that not all traces demand eventual consolidation at the systems level.
- Published
- 2009
31. Is Neuromorphic MNIST Neuromorphic? Analyzing the Discriminative Power of Neuromorphic Datasets in the Time Domain
- Author
-
Laxmi R. Iyer, Yansong Chua, and Haizhou Li
- Subjects
spiking neural network, spike timing dependent plasticity, N-MNIST dataset, neuromorphic benchmark, spike time coding, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
A major characteristic of spiking neural networks (SNNs) over conventional artificial neural networks (ANNs) is their ability to spike, enabling them to use spike timing for coding and efficient computing. In this paper, we assess if neuromorphic datasets recorded from static images are able to evaluate the ability of SNNs to use spike timings in their calculations. We have analyzed N-MNIST, N-Caltech101 and DvsGesture along these lines, but focus our study on N-MNIST. First we evaluate if additional information is encoded in the time domain in a neuromorphic dataset. We show that an ANN trained with backpropagation on frame-based versions of N-MNIST and N-Caltech101 images achieve 99.23 and 78.01% accuracy. These are comparable to the state of the art—showing that an algorithm that purely works on spatial data can classify these datasets. Second we compare N-MNIST and DvsGesture on two STDP algorithms, RD-STDP, that can classify only spatial data, and STDP-tempotron that classifies spatiotemporal data. We demonstrate that RD-STDP performs very well on N-MNIST, while STDP-tempotron performs better on DvsGesture. Since DvsGesture has a temporal dimension, it requires STDP-tempotron, while N-MNIST can be adequately classified by an algorithm that works on spatial data alone. This shows that precise spike timings are not important in N-MNIST. N-MNIST does not, therefore, highlight the ability of SNNs to classify temporal data. The conclusions of this paper open the question—what dataset can evaluate SNN ability to classify temporal data? (A minimal event-to-frame accumulation sketch follows this entry.)
- Published
- 2021
- Full Text
- View/download PDF
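The study converts neuromorphic (event-based) recordings into frame-based images before training an ANN. A minimal sketch of that conversion is shown below, assuming N-MNIST-style events given as (x, y, timestamp, polarity) tuples on a 34x34 grid; the tuple layout is an assumption, since actual file formats vary by dataset.

```python
import numpy as np

# Accumulate address-event data into a single frame, i.e. the "frame-based version" of an
# event recording referred to above. Events are assumed to be (x, y, timestamp, polarity)
# tuples on a 34x34 grid (N-MNIST resolution); real file layouts differ between datasets.

def events_to_frame(events, shape=(34, 34)):
    frame = np.zeros(shape, dtype=np.int32)
    for x, y, t, p in events:
        frame[y, x] += 1            # count events per pixel, discarding time and polarity
    return frame

frame = events_to_frame([(0, 0, 10, 1), (0, 0, 42, 0), (33, 17, 99, 1)])
print(frame.sum(), frame[0, 0], frame[17, 33])   # 3 2 1
```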
32. Learning Distance-Behavioural Preferences Using a Single Sensor in a Spiking Neural Network
- Author
-
Ross, Matt, Berberian, Nareg, Cyr, André, Thériault, Frédéric, Chartier, Sylvain, Lintas, Alessandra, editor, Rovetta, Stefano, editor, Verschure, Paul F.M.J., editor, and Villa, Alessandro E.P., editor
- Published
- 2017
- Full Text
- View/download PDF
33. Computational Neuroscience Offers Hints for More General Machine Learning
- Author
-
Rawlinson, David, Kowadlo, Gideon, Everitt, Tom, editor, Goertzel, Ben, editor, and Potapov, Alexey, editor
- Published
- 2017
- Full Text
- View/download PDF
34. Is Neuromorphic MNIST Neuromorphic? Analyzing the Discriminative Power of Neuromorphic Datasets in the Time Domain.
- Author
-
Iyer, Laxmi R., Chua, Yansong, and Li, Haizhou
- Subjects
ARTIFICIAL neural networks, TIME management - Abstract
A major characteristic of spiking neural networks (SNNs) over conventional artificial neural networks (ANNs) is their ability to spike, enabling them to use spike timing for coding and efficient computing. In this paper, we assess if neuromorphic datasets recorded from static images are able to evaluate the ability of SNNs to use spike timings in their calculations. We have analyzed N-MNIST, N-Caltech101 and DvsGesture along these lines, but focus our study on N-MNIST. First we evaluate if additional information is encoded in the time domain in a neuromorphic dataset. We show that an ANN trained with backpropagation on frame-based versions of N-MNIST and N-Caltech101 images achieve 99.23 and 78.01% accuracy. These are comparable to the state of the art—showing that an algorithm that purely works on spatial data can classify these datasets. Second we compare N-MNIST and DvsGesture on two STDP algorithms, RD-STDP, that can classify only spatial data, and STDP-tempotron that classifies spatiotemporal data. We demonstrate that RD-STDP performs very well on N-MNIST, while STDP-tempotron performs better on DvsGesture. Since DvsGesture has a temporal dimension, it requires STDP-tempotron, while N-MNIST can be adequately classified by an algorithm that works on spatial data alone. This shows that precise spike timings are not important in N-MNIST. N-MNIST does not, therefore, highlight the ability of SNNs to classify temporal data. The conclusions of this paper open the question—what dataset can evaluate SNN ability to classify temporal data?
- Published
- 2021
- Full Text
- View/download PDF
35. Multilayer Photonic Spiking Neural Networks: Generalized Supervised Learning Algorithm and Network Optimization
- Author
-
Chentao Fu, Shuiying Xiang, Yanan Han, Ziwei Song, and Yue Hao
- Subjects
photonic spiking neural network, multilayer spiking neural network, supervised learning, vertical-cavity surface-emitting lasers, spike timing dependent plasticity, Applied optics. Photonics, TA1501-1820 - Abstract
We propose a generalized supervised learning algorithm for multilayer photonic spiking neural networks (SNNs) by combining the spike-timing dependent plasticity (STDP) rule and the gradient descent mechanism. A vertical-cavity surface-emitting laser with an embedded saturable absorber (VCSEL-SA) is employed as a photonic leaky-integrate-and-fire (LIF) neuron. The temporal coding strategy is employed to transform information into the precise firing time. With the modified supervised learning algorithm, the trained multilayer photonic SNN successfully solves the XOR problem and performs well on the Iris and Wisconsin breast cancer datasets. This indicates that a generalized supervised learning algorithm is realized for multilayer photonic SNN. In addition, network optimization is performed by considering different network sizes.
- Published
- 2022
- Full Text
- View/download PDF
36. Bihemispheric Cerebellar Spiking Network Model to Simulate Acute VOR Motor Learning
- Author
-
Inagaki, Keiichiro, Hirata, Yutaka, Hirose, Akira, editor, Ozawa, Seiichi, editor, Doya, Kenji, editor, Ikeda, Kazushi, editor, Lee, Minho, editor, and Liu, Derong, editor
- Published
- 2016
- Full Text
- View/download PDF
37. Creation through Polychronization
- Author
-
John Matthias
- Subjects
collaboration, composition, polychronization, spike timing dependent plasticity, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571, Philosophy (General), B1-5802 - Abstract
I have recently suggested that some of the processes involved in the collaborative composition of new music could be analogous to several ideas introduced by Izhikevich in his theory of cortical spiking neurons and simple memory, a process which he calls Polychronization. In the Izhikevich model, the evocation of simple memories is achieved by the sequential re-firing of the same Polychronous group of neurons which was initially created in the cerebral cortex by the sensual stimulus. Each firing event within the group is contingent upon the previous firing event and, in particular, contingent upon the timing of the firings, due to a phenomenon known as “Spike Timing Dependent Plasticity.” I argue in this article that the collaborative creation of new music involves contingencies which form a Polychronous group across space and time which helps to create a temporary shared memorial space between the collaborators.
- Published
- 2017
- Full Text
- View/download PDF
38. Effect of spike-timing dependent plasticity rule choice on memory capacity and form in spiking neural networks
- Author
-
Tatsuno, Masami, Arthur, Derek, and University of Lethbridge. Faculty of Arts and Science
- Abstract
The strengthening of synapses between coactivating neurons is believed to be an important underlying mechanism for learning and memory. Hebbian learning of this type has been observed in the brain, with the degree of synaptic strength change dependent on the relative timing of pre-spike arrival and post-spike emission, called spike timing dependent plasticity (STDP). Another important feature of learning and memory is the existence of neural spike-timing patterns. Early work by Izhikevich (2006) argued that STDP spontaneously produces structures known as polychronous groups, defined by the network connectivity, that can produce such patterns. However, studies involving STDP face two important issues: how the STDP rule distributes synaptic weights and not knowing what STDP rule is used in the brain. This highlights the importance of understanding the fundamental properties of different STDP rules to determine their effect on the outcome of computational studies. This study focuses on the comparison of two STDP rules, one used by Izhikevich (2006), add-STDP, that produces a bimodal weight distribution, and log-STDP which produces a lognormal weight distribution. The comparison made is between the number of polychronous groups produced and the number of spike-timing patterns, or cell ensembles, found with another detection method that is applicable to experimental data. The number of polychronous groups found with add-STDP was significantly larger as were their sizes and durations. In contrast, the number of cell ensembles found in log-STDP was considerably larger, however, sizes and lifetimes were comparable. Lastly, the activity of cell ensembles in the log-STDP simulations has a non-trivial relationship with the dynamics of synaptic weights in the network, whereas no relationship was found for add-STDP. (A simplified add-STDP versus log-STDP sketch follows this entry.)
- Published
- 2023
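The thesis contrasts an additive rule (add-STDP, which drives clipped weights toward a bimodal distribution) with log-STDP (which yields a lognormal-like distribution). The sketch below is a deliberately simplified reading of that contrast: depression is weight-independent in the additive rule and grows roughly with the logarithm of the weight in the log variant. The constants and the exact logarithmic form are assumptions, not the update equations used in the thesis.

```python
import numpy as np

# Simplified contrast between an additive STDP update and a log-weight-dependent variant.
# Constants and the exact functional form are illustrative assumptions.

A_PLUS, A_MINUS, W_REF = 0.010, 0.012, 0.1

def add_stdp(w, potentiating, w_max=1.0):
    """Weight-independent update; hard clipping tends to push weights to the two bounds."""
    dw = A_PLUS if potentiating else -A_MINUS
    return float(np.clip(w + dw, 0.0, w_max))

def log_stdp(w, potentiating):
    """Depression scales roughly with log(weight), discouraging very large weights."""
    if potentiating:
        dw = A_PLUS
    else:
        dw = -A_MINUS * (1.0 + np.log1p(w / W_REF))
    return max(w + dw, 0.0)
```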
39. STDP Produces Well Behaved Oscillations and Synchrony
- Author
-
Bhowmik, David, Shanahan, Murray, and Liljenström, Hans, editor
- Published
- 2015
- Full Text
- View/download PDF
40. Active Electroreception in Weakly Electric Fish
- Author
-
Caputi, Angel Ariel
- Published
- 2017
- Full Text
- View/download PDF
41. Controlled Forgetting: Targeted Stimulation and Dopaminergic Plasticity Modulation for Unsupervised Lifelong Learning in Spiking Neural Networks.
- Author
-
Allred, Jason M. and Roy, Kaushik
- Subjects
CONTINUING education ,MEMORY loss ,DATA distribution ,DOPAMINE receptors - Abstract
Stochastic gradient descent requires that training samples be drawn from a uniformly random distribution of the data. For a deployed system that must learn online from an uncontrolled and unknown environment, the ordering of input samples often fails to meet this criterion, making lifelong learning a difficult challenge. We exploit the locality of the unsupervised Spike Timing Dependent Plasticity (STDP) learning rule to target local representations in a Spiking Neural Network (SNN), allowing it to adapt to novel information while protecting essential information in the remainder of the SNN from catastrophic forgetting. In our Controlled Forgetting Networks (CFNs), novel information triggers stimulated firing and heterogeneously modulated plasticity, inspired by biological dopamine signals, causing rapid and isolated adaptation in the synapses of the neurons associated with outlier information. This targeting controls the forgetting process in a way that reduces the degradation of accuracy on older tasks while learning new ones. Our experimental results on the MNIST dataset validate the capability of CFNs to learn successfully over time from an unknown, changing environment, achieving 95.24% accuracy, which we believe is the best unsupervised accuracy yet achieved by a fixed-size, single-layer SNN on a completely disjoint MNIST dataset. (An illustrative sketch of the targeted, dopamine-modulated update follows this entry.) [ABSTRACT FROM AUTHOR] more...
- Published
- 2020
- Full Text
- View/download PDF
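The targeted, dopamine-inspired modulation described in the entry above can be illustrated with a short sketch. It is not the authors' Controlled Forgetting Network: the novelty test, the learning-rate boost, and the simplified weight update below are hypothetical stand-ins that only show how an outlier input can trigger stimulated firing and locally boosted plasticity in one neuron while the rest of the layer stays nearly frozen.

```python
import numpy as np

# Hypothetical sketch of locally targeted, dopamine-inspired plasticity.
# Network shape, novelty test, and constants are assumptions for illustration.
rng = np.random.default_rng(0)
n_in, n_out = 784, 100
W = rng.uniform(0.0, 0.3, size=(n_out, n_in))   # input -> excitatory weights

BASE_LR = 0.001
NOVELTY_THRESH = 5.0      # hypothetical response threshold for "outlier" inputs
DOPAMINE_BOOST = 50.0     # plasticity boost for the stimulated neuron

def present(x_spikes):
    """x_spikes: length-784 vector of input spike counts for one window."""
    response = W @ x_spikes                       # crude activation proxy
    winner = int(np.argmax(response))
    novel = response[winner] < NOVELTY_THRESH     # weak best match -> outlier input

    lr = np.full(n_out, BASE_LR)
    if novel:
        # "stimulated firing": force adaptation onto a single neuron and
        # boost its plasticity while freezing the rest of the layer
        lr[:] = 0.0
        lr[winner] = BASE_LR * DOPAMINE_BOOST

    # simplified, post-gated Hebbian/STDP-like step applied only to the winner
    W[winner] += lr[winner] * (x_spikes - W[winner])
    np.clip(W, 0.0, 1.0, out=W)
    return winner
```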
42. STDP-Based Unsupervised Spike Pattern Learning in a Photonic Spiking Neural Network With VCSELs and VCSOAs.
- Author
-
Xiang, Shuiying, Zhang, Yahui, Gong, Junkai, Guo, Xingxing, Lin, Lin, and Hao, Yue
- Abstract
We propose a photonic spiking neural network (SNN) consisting of photonic spiking neurons based on vertical-cavity surface-emitting lasers (VCSELs). Photonic spike timing dependent plasticity (STDP) is implemented in a vertical-cavity semiconductor optical amplifier (VCSOA). A versatile computational model of the photonic SNN is presented based on rate equation models. Through numerical simulation, a spike pattern learning and recognition task is performed using the photonic STDP. The results show that the post-synaptic spike timing (PST) eventually converges, iteratively and via unsupervised learning, to the first spike timing of the input spike pattern. Additionally, the convergence of the PST can be accelerated in a photonic SNN with more pre-synaptic neurons. The effects of VCSOA parameters on the convergence performance of the unsupervised spike learning are also considered. To the best of our knowledge, such a versatile computational model of a photonic SNN for unsupervised learning and recognition of arbitrary spike patterns has not yet been reported; it represents a step toward numerical implementation of a large-scale, energy-efficient photonic SNN and is therefore of interest for neuromorphic photonic systems and spiking information processing. [ABSTRACT FROM AUTHOR] more...
- Published
- 2019
- Full Text
- View/download PDF
43. Lightweight spiking neural network training based on spike timing dependent backpropagation.
- Author
-
Gong, Yu, Chen, Tao, Wang, Shu, Duan, Shukai, and Wang, Lidan
- Subjects
- *ARTIFICIAL neural networks , *BIOENERGETICS , *WEIGHT training , *ENERGY consumption - Abstract
Spiking neural networks are energy efficient and biologically interpretable, communicating through sparse, asynchronous spikes, which makes them well suited to neuromorphic hardware. However, because time-coded binarized spiking neural networks use binary weights and spike trains, forward propagation may leave some neurons never firing ("dead" neurons), and backward propagation suffers from non-differentiability. Moreover, the deep and complex network structures currently in use generate a large number of redundant parameters. Effective methods are therefore needed to improve energy efficiency without reducing accuracy. We propose a dynamic threshold model that reduces the number of dead neurons. We combine the backpropagation algorithm with the spike timing dependent plasticity algorithm to avoid the non-differentiability problem. We also propose a neuron pruning strategy based on an adaptive firing-time threshold. This strategy prunes 267 neurons in a network of 600 neurons, reducing the network size and yielding a more compact structure. Energy efficiency is improved by 0.55 ×, while classification accuracy is reduced by 1.1%. (An illustrative sketch of firing-time-based pruning follows this entry.) [ABSTRACT FROM AUTHOR] more...
- Published
- 2024
- Full Text
- View/download PDF
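The adaptive firing-time pruning idea in the entry above might look roughly like the following sketch. The criterion used here (median first-spike time plus a slack fraction of the simulation window) and all numbers are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

# Sketch of pruning by firing time: in a time-coded SNN, neurons whose first
# spike arrives very late contribute little to the decision and are removed.
# The adaptive threshold rule and all numbers below are assumptions.
rng = np.random.default_rng(1)
n_hidden = 600
T_MAX = 100.0                                  # simulation window (ms)

# mean first-spike time of each hidden neuron over a validation set (mock data)
mean_first_spike = rng.uniform(0.0, T_MAX, size=n_hidden)

def prune_by_firing_time(first_spike_times, slack=0.25):
    """Keep neurons firing before an adaptive threshold: the median
    first-spike time plus a slack fraction of the window."""
    threshold = np.median(first_spike_times) + slack * T_MAX
    return first_spike_times <= threshold

keep_mask = prune_by_firing_time(mean_first_spike)
print(f"kept {keep_mask.sum()} of {n_hidden} neurons, "
      f"pruned {n_hidden - keep_mask.sum()}")
```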
44. Designing Behaviour in Bio-inspired Robots Using Associative Topologies of Spiking-Neural-Networks
- Author
-
Cristian Jimenez-Romero, David Sousa-Rodrigues, and Jeffrey Johnson
- Subjects
spiking neurons ,spike timing dependent plasticity ,associative learning ,robotics ,agents simulation ,artificial life ,Technology - Abstract
This study explores the design and control of the behaviour of agents and robots using simple circuits of spiking neurons and Spike Timing Dependent Plasticity (STDP) as a mechanism of associative and unsupervised learning. Based on "reward and punishment" classical conditioning, it is demonstrated that these robots learn to identify and avoid obstacles as well as to identify and seek out rewarding stimuli. Using the simulation and programming environment NetLogo, a software engine for the Integrate and Fire model was developed, which allowed the dynamics of every neuron, synapse, and spike in the proposed neural networks to be monitored in discrete time steps. These spiking neural networks (SNNs) served as simple brains for the experimental robots. The Lego Mindstorms robot kit was used to embody the simulated agents. The paper presents the topological building blocks and the neural parameters required to reproduce the experiments, and summarizes the resulting behaviour and the observed dynamics of the neural circuits. A link to the NetLogo code is included in the annex. (An illustrative sketch of the underlying neuron and plasticity updates follows this entry.) more...
- Published
- 2016
- Full Text
- View/download PDF
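The original engine was written in NetLogo; the sketch below only illustrates, in Python, the kind of discrete-time integrate-and-fire update and pair-based STDP rule such an engine iterates each tick. Parameter values are illustrative assumptions.

```python
import math

# Discrete-time integrate-and-fire neuron plus pair-based STDP, the kind of
# per-tick update the cited NetLogo engine iterates. Parameters are assumptions.
V_THRESH, V_RESET, DECAY = 1.0, 0.0, 0.9
A_PLUS, A_MINUS, TAU = 0.05, 0.05, 10.0

class LIFNeuron:
    def __init__(self):
        self.v = 0.0
        self.last_spike_t = None

    def step(self, t, input_current):
        """Leaky integration; returns True on a spike."""
        self.v = DECAY * self.v + input_current
        if self.v >= V_THRESH:
            self.v = V_RESET
            self.last_spike_t = t
            return True
        return False

def stdp(w, t_pre, t_post, w_min=0.0, w_max=1.0):
    """Pair-based STDP on one synapse given the latest pre/post spike times."""
    if t_pre is None or t_post is None:
        return w
    dt = t_post - t_pre
    if dt >= 0:
        w += A_PLUS * math.exp(-dt / TAU)     # pre-then-post: potentiate
    else:
        w -= A_MINUS * math.exp(dt / TAU)     # post-then-pre: depress
    return min(max(w, w_min), w_max)
```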
45. A Robotic Simulation Framework for Cognitive Systems
- Author
-
Arena, P., Patanè, L., Vitanza, A., Dillmann, Rüdiger, Series editor, Nakamura, Yoshihiko, Series editor, Schaal, Stefan, Series editor, Vernon, David, Series editor, Arena, Paolo, editor, and Patanè, Luca, editor more...
- Published
- 2014
- Full Text
- View/download PDF
46. An Artificial Neural Network Based on the Architecture of the Cerebellum for Behavior Learning
- Author
-
Iwadate, Kenji, Suzuki, Ikuo, Watanabe, Michiko, Yamamoto, Masahito, Furukawa, Masashi, Kacprzyk, Janusz, Series editor, Cho, Young Im, editor, and Matson, Eric T., editor
- Published
- 2014
- Full Text
- View/download PDF
47. Memristors and Memristive Devices for Neuromorphic Computing
- Author
-
Sheridan, Patrick, Lu, Wei, Adamatzky, Andrew, editor, and Chua, Leon, editor
- Published
- 2014
- Full Text
- View/download PDF
48. On Practical Issues for Stochastic STDP Hardware With 1-bit Synaptic Weights
- Author
-
Amirreza Yousefzadeh, Evangelos Stromatias, Miguel Soto, Teresa Serrano-Gotarredona, and Bernabé Linares-Barranco
- Subjects
spiking neural networks ,spike timing dependent plasticity ,stochastic learning ,feature extraction ,neuromorphic systems ,Neurosciences. Biological psychiatry. Neuropsychiatry ,RC321-571 - Abstract
In computational neuroscience, synaptic plasticity learning rules are typically studied at the full 64-bit floating-point precision that computers provide. For dedicated hardware implementations, however, the precision used directly penalizes not only the required memory resources but also the computing, communication, and energy resources. When it comes to hardware engineering, a key question is always to find the minimum number of bits needed to keep the neurocomputational system working satisfactorily. Here we present techniques and results obtained when limiting synaptic weights to 1-bit precision, applied to a Spike-Timing-Dependent-Plasticity (STDP) learning rule in Spiking Neural Networks (SNNs). We first illustrate the operation of STDP with 1-bit synapses by replicating a classical biological experiment on visual orientation tuning, using a simple four-neuron setup. We then apply 1-bit STDP learning to the hidden feature-extraction layer of a two-layer system, using previously reported SNN classifiers for the second (output) layer. The systems are tested on two spiking datasets: a Dynamic Vision Sensor (DVS) recording of poker card symbols and a Poisson-distributed spike representation of the MNIST dataset. Tests are performed using the in-house MegaSim event-driven behavioral simulator and by implementing the systems on FPGA (Field Programmable Gate Array) hardware. (An illustrative sketch of a stochastic 1-bit weight update follows this entry.) more...
- Published
- 2018
- Full Text
- View/download PDF
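A minimal sketch of stochastic STDP with 1-bit synapses, assuming an exponential timing-dependent flip probability; the probability shape and constants are assumptions, not the parameters reported in the paper.

```python
import numpy as np

# Stochastic STDP with 1-bit synapses: instead of accumulating small analog
# changes, each pre/post pairing flips the binary weight with a timing-dependent
# probability. The probability shape and constants are assumptions.
rng = np.random.default_rng(2)
TAU = 20.0          # ms
P_MAX = 0.1         # maximum flip probability per pairing

def stochastic_1bit_update(w_bit, dt):
    """w_bit in {0, 1}; dt = t_post - t_pre in ms."""
    p = P_MAX * np.exp(-abs(dt) / TAU)
    if dt >= 0 and w_bit == 0 and rng.random() < p:
        return 1                      # probabilistic potentiation: set the bit
    if dt < 0 and w_bit == 1 and rng.random() < p:
        return 0                      # probabilistic depression: clear the bit
    return w_bit
```

Averaged over many pairings, the expected weight change follows the usual exponential STDP window, while each individual synapse only ever stores a single bit.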
49. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning
- Author
-
Chankyu Lee, Priyadarshini Panda, Gopalakrishnan Srinivasan, and Kaushik Roy
- Subjects
spiking neural network ,convolutional neural network ,spike-based learning rule ,spike timing dependent plasticity ,gradient descent backpropagation ,leaky integrate and fire neuron ,Neurosciences. Biological psychiatry. Neuropsychiatry ,RC321-571 - Abstract
Spiking Neural Networks (SNNs) are fast becoming a promising candidate for brain-inspired neuromorphic computing because of their inherent power efficiency and impressive inference accuracy across several cognitive tasks such as image classification and speech recognition. Recent efforts in SNNs have focused on implementing deeper networks with multiple hidden layers in order to capture increasingly complex functional representations. In this paper, we propose a pre-training scheme using biologically plausible unsupervised learning, namely Spike-Timing-Dependent-Plasticity (STDP), to better initialize the parameters in multi-layer systems prior to supervised optimization. The multi-layer SNN comprises alternating convolutional and pooling layers followed by fully-connected layers, all populated with leaky integrate-and-fire spiking neurons. We train the deep SNNs in two phases: first, the convolutional kernels are pre-trained in a layer-wise manner with unsupervised learning; then the synaptic weights are fine-tuned with spike-based supervised gradient-descent backpropagation. Our experiments on digit recognition demonstrate that STDP-based pre-training combined with gradient-based optimization provides improved robustness, faster training (~2.5×), and better generalization compared with purely gradient-based training without pre-training. (An illustrative outline of the two-phase schedule follows this entry.) more...
- Published
- 2018
- Full Text
- View/download PDF
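The two-phase schedule described above (layer-wise unsupervised STDP pre-training followed by supervised spike-based fine-tuning) can be outlined as follows. Both functions are toy stand-ins: the "STDP" step is a crude rate-based winner-take-all proxy and the fine-tuning step is plain gradient descent on a caller-supplied gradient, not the authors' spike-based backpropagation.

```python
import numpy as np

# Structural outline of the two-phase schedule: layer-wise unsupervised
# pre-training of convolutional kernels, then supervised fine-tuning of all
# weights. Both functions are toy stand-ins, not the authors' implementation.
rng = np.random.default_rng(3)

def stdp_pretrain_layer(kernels, spike_batches, lr=0.01, epochs=2):
    """Unsupervised layer-wise step: move each kernel toward the input
    patches that drive it most strongly (a crude rate-based STDP proxy)."""
    for _ in range(epochs):
        for patches in spike_batches:              # patches: (n_patches, k)
            responses = patches @ kernels.T        # (n_patches, n_kernels)
            winners = responses.argmax(axis=1)
            for j in range(kernels.shape[0]):
                sel = patches[winners == j]
                if len(sel):
                    kernels[j] += lr * (sel.mean(axis=0) - kernels[j])
    return kernels

def supervised_finetune(weights, grads_fn, lr=0.001, steps=100):
    """Supervised phase stand-in: plain gradient descent using a
    caller-supplied gradient function in place of spike-based backprop."""
    for _ in range(steps):
        weights -= lr * grads_fn(weights)
    return weights

# Phase 1: pre-train 8 kernels of length 9 on random binary "spike patch" batches
kernels = rng.normal(0.0, 0.1, size=(8, 9))
batches = [rng.integers(0, 2, size=(32, 9)).astype(float) for _ in range(10)]
kernels = stdp_pretrain_layer(kernels, batches)

# Phase 2: fine-tune with a dummy quadratic objective supplying the gradients
kernels = supervised_finetune(kernels, grads_fn=lambda w: 2.0 * w)
```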
50. Inhibitory Network Dependency in Cantor Coding
- Author
-
Fukushima, Yasuhiro, Isomura, Yoshikazu, Yamaguti, Yutaka, Kuroda, Shigeru, Tsuda, Ichiro, Tsukada, Minoru, and Yamaguchi, Yoko, editor
- Published
- 2013
- Full Text
- View/download PDF