1,077 results for "hebbian learning"
Search Results
2. The effects of implementing phenomenology in a deep neural network
- Author
- Bensemann, Joshua and Witbrock, Michael
- Published
- 2021
- Full Text
- View/download PDF
3. Can a Hebbian-like learning rule be avoiding the curse of dimensionality in sparse distributed data?
- Author
- Osório, Maria, Sa-Couto, Luis, and Wichert, Andreas
- Subjects
- MACHINE learning, BOLTZMANN machine, MACHINE performance, PROBLEM solving, INTUITION
- Abstract
It is generally assumed that the brain uses something akin to sparse distributed representations. These representations, however, are high-dimensional and consequently they affect classification performance of traditional Machine Learning models due to the "curse of dimensionality". In tasks for which there is a vast amount of labeled data, Deep Networks seem to solve this issue with many layers and a non-Hebbian backpropagation algorithm. The brain, however, seems to be able to solve the problem with few layers. In this work, we hypothesize that this happens by using Hebbian learning. Actually, the Hebbian-like learning rule of Restricted Boltzmann Machines learns the input patterns asymmetrically. It exclusively learns the correlation between non-zero values and ignores the zeros, which represent the vast majority of the input dimensionality. By ignoring the zeros the "curse of dimensionality" problem can be avoided. To test our hypothesis, we generated several sparse datasets and compared the performance of a Restricted Boltzmann Machine classifier with some Backprop-trained networks. The experiments using these codes confirm our initial intuition as the Restricted Boltzmann Machine shows a good generalization performance, while the Neural Networks trained with the backpropagation algorithm overfit the training data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
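A minimal sketch of the zero-ignoring, Hebbian-like update discussed in the abstract above, assuming a binary RBM trained with one-step contrastive divergence (biases omitted; all names are illustrative, not the authors' code). The positive, data-driven term is a product with the visible activity, so input dimensions where v_i = 0 contribute nothing to it, which is the asymmetry the authors argue sidesteps the curse of dimensionality:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v, lr=0.01):
    """One-step contrastive divergence for a binary RBM (biases omitted).

    The positive, data-driven term np.outer(v, h_prob) is a Hebbian-like
    product of activities, so it vanishes in every row where v_i == 0:
    the rule learns correlations between active units only.
    """
    h_prob = sigmoid(v @ W)                        # positive (data-driven) phase
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
    v_recon = sigmoid(h_sample @ W.T)              # negative (model-driven) phase
    h_recon = sigmoid(v_recon @ W)
    return W + lr * (np.outer(v, h_prob) - np.outer(v_recon, h_recon))

# Toy usage: a sparse binary code drives only a handful of rows of W.
W = rng.normal(scale=0.01, size=(100, 16))
v = np.zeros(100)
v[[3, 41, 77]] = 1.0                               # 3 active units out of 100
W = cd1_update(W, v)
```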
4. Hebbian Learning with Kernel-Based Embedding of Input Data.
- Author
- Ushikoshi, Thiago A., Freitas, Elias J. R., Menezes, Murilo, Junior, Wagner J. A., Torres, Luiz C. B., and Braga, Antonio P.
- Abstract
Although it requires simple computations, provides good performance on linear classification tasks and offers a suitable environment for active learning strategies, the Hebbian learning rule is very sensitive to how the training data relate to each other in the input space. Since this spatial arrangement is inherent to each set of samples, the practical application of this learning paradigm is limited. Thus, representation learning may play an important role in projecting the input data into a new space where linear separability is improved. Earlier methods based on orthogonal coding addressed this issue but presented many side effects, impoverishing the generalization of the model. Hence, this paper considers a recently proposed method based on kernel density estimators, which performs a likelihood-based projection where linear separability and generalization capacity are enhanced in an autonomous fashion. Results show that this novel method allows one to use linear classifiers to solve many binary classification problems and surpass the performance of well-established classifiers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Spiking representation learning for associative memories.
- Author
- Ravichandran, Naresh, Lansner, Anders, and Herman, Pawel
- Subjects
- ARTIFICIAL neural networks, ASSOCIATIVE learning, DEEP learning, PROBLEM solving, LEARNING ability
- Abstract
Networks of interconnected neurons communicating through spiking signals offer the bedrock of neural computations. Our brain's spiking neural networks have the computational capacity to achieve complex pattern recognition and cognitive functions effortlessly. However, solving real-world problems with artificial spiking neural networks (SNNs) has proved to be difficult for a variety of reasons. Crucially, scaling SNNs to large networks and processing large-scale real-world datasets have been challenging, especially when compared to their non-spiking deep learning counterparts. The critical operation that is needed of SNNs is the ability to learn distributed representations from data and use these representations for perceptual, cognitive and memory operations. In this work, we introduce a novel SNN that performs unsupervised representation learning and associative memory operations leveraging Hebbian synaptic and activity-dependent structural plasticity coupled with neuron-units modelled as Poisson spike generators with sparse firing (~1 Hz mean and ~100 Hz maximum firing rate). Crucially, the architecture of our model derives from the neocortical columnar organization and combines feedforward projections for learning hidden representations and recurrent projections for forming associative memories. We evaluated the model on properties relevant for attractor-based associative memories such as pattern completion, perceptual rivalry, distortion resistance, and prototype extraction. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
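As a toy illustration of the neuron units described in the abstract above (Poisson spike generators with ~1 Hz mean and ~100 Hz maximum firing rate), here is a minimal sketch; the names and the exponential rate distribution are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_spikes(rates_hz, dt=0.001, steps=2000):
    """Spike trains from independent Poisson generators: each unit fires
    with probability rate * dt per time step, rates capped at 100 Hz."""
    rates = np.clip(rates_hz, 0.0, 100.0)
    return rng.random((steps, rates.size)) < rates * dt

rates = rng.exponential(scale=1.0, size=50)   # sparse firing, ~1 Hz mean
spikes = poisson_spikes(rates)
print(f"mean rate: {spikes.mean() / 0.001:.2f} Hz")
```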
6. Recall tempo of Hebbian sequences depends on the interplay of Hebbian kernel with tutor signal timing.
- Author
- Farrell, Matthew and Pehlevan, Cengiz
- Subjects
- SEQUENTIAL circuits, MOTOR learning, ARTIFICIAL intelligence, NEUROPLASTICITY, NEURAL circuitry
- Abstract
Understanding how neural circuits generate sequential activity is a longstanding challenge. While foundational theoretical models have shown how sequences can be stored as memories in neural networks with Hebbian plasticity rules, these models considered only a narrow range of Hebbian rules. Here, we introduce a model for arbitrary Hebbian plasticity rules, capturing the diversity of spike-timing-dependent synaptic plasticity seen in experiments, and show how the choice of these rules and of neural activity patterns influences sequence memory formation and retrieval. In particular, we derive a general theory that predicts the tempo of sequence replay. This theory lays a foundation for explaining how cortical tutor signals might give rise to motor actions that eventually become "automatic." Our theory also captures the impact of changing the tempo of the tutor signal. Beyond shedding light on biological circuits, this theory has relevance in artificial intelligence by laying a foundation for frameworks whereby slow and computationally expensive deliberation can be stored as memories and eventually replaced by inexpensive recall. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
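To make the role of the Hebbian kernel concrete, here is a minimal toy construction of my own (not the paper's model): a temporal kernel weights outer products between activity at different lags, and a purely asymmetric kernel chains each stored pattern to the next, which is what makes recall advance at some tempo:

```python
import numpy as np

def sequence_weights(patterns, kernel):
    """Accumulate Hebbian outer products between activity at time t + tau
    and time t, weighted by kernel[tau]. kernel = [0, 1] is the classic
    asymmetric rule W += xi(t+1) xi(t)^T that chains patterns; putting
    weight at tau = 0 makes the rule more symmetric (slower recall)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for tau, k in enumerate(kernel):
        for t in range(len(patterns) - tau):
            W += k * np.outer(patterns[t + tau], patterns[t])
    return W / n

rng = np.random.default_rng(2)
xi = rng.choice([-1.0, 1.0], size=(5, 200))    # 5 patterns, 200 neurons
W = sequence_weights(xi, kernel=[0.0, 1.0])    # purely asymmetric kernel

# One synchronous update step: presenting pattern 0 retrieves pattern 1.
overlap = float(np.mean(np.sign(W @ xi[0]) * xi[1]))
print(f"overlap with next pattern: {overlap:.2f}")
```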
7. Annotate and retrieve in vivo images using hybrid self-organizing map.
- Author
- Kaur, Parminder, Malhi, Avleen, and Pannu, Husanbir
- Subjects
- SELF-organizing maps, TEXT recognition, CONTENT-based image retrieval, ASSOCIATIVE learning
- Abstract
Multimodal retrieval has gained much attention lately due to its effectiveness over uni-modal retrieval. For instance, visual features often under-constrain the description of an image in content-based retrieval; however, another modality, such as collateral text, can be introduced to bridge the semantic gap and make the retrieval process more efficient. This article proposes the application of cross-modal fusion and retrieval on real in vivo gastrointestinal images and linguistic cues, as the visual features alone are insufficient for image description and to assist gastroenterologists. So, a cross-modal information retrieval approach has been proposed to retrieve related images given text and vice versa while handling the heterogeneity gap issue among the modalities. The technique comprises two stages: (1) individual modality feature learning; and (2) fusion of two trained networks. In the first stage, two self-organizing maps (SOMs) are trained separately using images and texts, which are clustered in the respective SOMs based on their similarity. In the second (fusion) stage, the trained SOMs are integrated using an associative network to enable cross-modal retrieval. The underlying learning techniques of the associative network include Hebbian learning and Oja learning (Improved Hebbian learning). The introduced framework can annotate images with keywords and illustrate keywords with images, and it can also be extended to incorporate more diverse modalities. Extensive experimentation has been performed on real gastrointestinal images obtained from a known gastroenterologist, with collateral keywords accompanying each image. The obtained results proved the efficacy of the algorithm and its significance in aiding gastroenterologists in quick and pertinent decision making. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
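Since the associative network above relies on Hebbian and Oja learning, a minimal sketch of Oja's rule (the "Improved Hebbian learning" the abstract mentions) may help; the toy covariance and variable names are illustrative:

```python
import numpy as np

def oja_step(w, x, lr=0.01):
    """Oja's rule: plain Hebb (lr * y * x) plus a -lr * y^2 * w decay that
    keeps ||w|| bounded, so the weight vector converges to the first
    principal direction of the inputs."""
    y = w @ x
    return w + lr * y * (x - y * w)

rng = np.random.default_rng(3)
L = np.linalg.cholesky(np.array([[3.0, 1.0], [1.0, 1.0]]))  # input covariance
w = rng.normal(size=2)
for _ in range(5000):
    w = oja_step(w, L @ rng.normal(size=2))
print("learned direction:", w / np.linalg.norm(w))
```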
8. Robustness of Biologically Grounded Neural Networks Against Image Perturbations
- Author
- Teichmann, Michael, Larisch, René, Hamker, Fred H., Goos, Gerhard, Series Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Wand, Michael, editor, Malinovská, Kristína, editor, Schmidhuber, Jürgen, editor, and Tetko, Igor V., editor
- Published
- 2024
- Full Text
- View/download PDF
9. Active Inference in Hebbian Learning Networks
- Author
- Safa, Ali, Keuninckx, Lars, Gielen, Georges, and Catthoor, Francky
- Published
- 2024
- Full Text
- View/download PDF
10. Associative Interpretability of Hidden Semantics with Contrastiveness Operators in Face Classification Tasks
- Author
- Aguilar-Canto, Fernando, García-Vásquez, Omar, Alcántara, Tania, Espinosa-Juárez, Alberto, Calvo, Hiram, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, Longo, Luca, editor, Lapuschkin, Sebastian, editor, and Seifert, Christin, editor
- Published
- 2024
- Full Text
- View/download PDF
11. Learning Hidden Markov Model of Stochastic Environment with Bio-inspired Probabilistic Temporal Memory
- Author
- Dzhivelikian, Evgenii, Kuderov, Petr, Panov, Aleksandr I., Kacprzyk, Janusz, Series Editor, Samsonovich, Alexei V., editor, and Liu, Tingting, editor
- Published
- 2024
- Full Text
- View/download PDF
12. Transitional probabilities outweigh frequency of occurrence in statistical learning of simultaneously presented visual shapes
- Author
- Endress, Ansgar D.
- Published
- 2024
- Full Text
- View/download PDF
13. Predictive coding with spiking neurons and feedforward gist signaling.
- Author
- Lee, Kwangjun, Dora, Shirin, Mejias, Jorge F., Bohte, Sander M., and Pennartz, Cyriel M. A.
- Subjects
- ARTIFICIAL neural networks, MACHINE learning, PERCEPTUAL learning, NEURAL codes, LINEAR network coding
- Abstract
Predictive coding (PC) is an influential theory in neuroscience, which suggests the existence of a cortical architecture that is constantly generating and updating predictive representations of sensory inputs. Owing to its hierarchical and generative nature, PC has inspired many computational models of perception in the literature. However, the biological plausibility of existing models has not been sufficiently explored due to their use of artificial neurons that approximate neural activity with firing rates in the continuous time domain and propagate signals synchronously. Therefore, we developed a spiking neural network for predictive coding (SNN-PC), in which neurons communicate using event-driven and asynchronous spikes. Adopting the hierarchical structure and Hebbian learning algorithms from previous PC neural network models, SNN-PC introduces two novel features: (1) a fast feedforward sweep from the input to higher areas, which generates a spatially reduced and abstract representation of input (i.e., a neural code for the gist of a scene) and provides a neurobiological alternative to an arbitrary choice of priors; and (2) a separation of positive and negative error-computing neurons, which counters the biological implausibility of a bidirectional error neuron with a very high baseline firing rate. After training with the MNIST handwritten digit dataset, SNN-PC developed hierarchical internal representations and was able to reconstruct samples it had not seen during training. SNN-PC suggests biologically plausible mechanisms by which the brain may perform perceptual inference and learning in an unsupervised manner. In addition, it may be used in neuromorphic applications that can utilize its energy-efficient, event-driven, local learning, and parallel information processing nature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
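The separation of positive and negative error-computing neurons described above can be illustrated in a few lines (a sketch, not the SNN-PC code): firing rates cannot be negative, so a signed error is carried by two rectified populations instead of one bidirectional unit with a high baseline rate:

```python
import numpy as np

def split_error(x, prediction):
    """Two rectified error populations standing in for one signed error:
    e_pos fires where the input exceeds the prediction, e_neg where the
    prediction exceeds the input; e_pos - e_neg == x - prediction."""
    e_pos = np.maximum(0.0, x - prediction)
    e_neg = np.maximum(0.0, prediction - x)
    return e_pos, e_neg

x = np.array([0.8, 0.1, 0.5])
prediction = np.array([0.5, 0.4, 0.5])
print(split_error(x, prediction))   # (array([0.3, 0., 0.]), array([0., 0.3, 0.]))
```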
14. Spiking representation learning for associative memories
- Author
- Naresh Ravichandran, Anders Lansner, and Pawel Herman
- Subjects
- spiking neural networks, associative memory, attractor dynamics, Hebbian learning, structural plasticity, BCPNN, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
Networks of interconnected neurons communicating through spiking signals offer the bedrock of neural computations. Our brain’s spiking neural networks have the computational capacity to achieve complex pattern recognition and cognitive functions effortlessly. However, solving real-world problems with artificial spiking neural networks (SNNs) has proved to be difficult for a variety of reasons. Crucially, scaling SNNs to large networks and processing large-scale real-world datasets have been challenging, especially when compared to their non-spiking deep learning counterparts. The critical operation that is needed of SNNs is the ability to learn distributed representations from data and use these representations for perceptual, cognitive and memory operations. In this work, we introduce a novel SNN that performs unsupervised representation learning and associative memory operations leveraging Hebbian synaptic and activity-dependent structural plasticity coupled with neuron-units modelled as Poisson spike generators with sparse firing (~1 Hz mean and ~100 Hz maximum firing rate). Crucially, the architecture of our model derives from the neocortical columnar organization and combines feedforward projections for learning hidden representations and recurrent projections for forming associative memories. We evaluated the model on properties relevant for attractor-based associative memories such as pattern completion, perceptual rivalry, distortion resistance, and prototype extraction.
- Published
- 2024
- Full Text
- View/download PDF
15. Dynamic control of sequential retrieval speed in networks with heterogeneous learning rules
- Author
- Maxwell Gillett and Nicolas Brunel
- Subjects
- sequential activity, temporal rescaling, hebbian learning, Medicine, Science, Biology (General), QH301-705.5
- Abstract
Temporal rescaling of sequential neural activity has been observed in multiple brain areas during behaviors involving time estimation and motor execution at variable speeds. Temporally asymmetric Hebbian rules have been used in network models to learn and retrieve sequential activity, with characteristics that are qualitatively consistent with experimental observations. However, in these models sequential activity is retrieved at a fixed speed. Here, we investigate the effects of a heterogeneity of plasticity rules on network dynamics. In a model in which neurons differ by the degree of temporal symmetry of their plasticity rule, we find that retrieval speed can be controlled by varying external inputs to the network. Neurons with temporally symmetric plasticity rules act as brakes and tend to slow down the dynamics, while neurons with temporally asymmetric rules act as accelerators of the dynamics. We also find that such networks can naturally generate separate ‘preparatory’ and ‘execution’ activity patterns with appropriate external inputs.
- Published
- 2024
- Full Text
- View/download PDF
16. MANC: A masked autoencoder neural cryptography based encryption scheme for CT scan images
- Author
- Kishore Kumar, Sarvesh Tanwar, and Shishir Kumar
- Subjects
- Secret sharing, Auto encoder, Tree parity machine, Hebbian learning, Image encryption, Masked transformer, Science
- Abstract
Sharing medical images securely is very important towards keeping patients' data confidential. In this paper we propose MANC: a Masked Autoencoder Neural Cryptography based encryption scheme for sharing medical images. The proposed technique builds upon recently proposed masked autoencoders. In the original paper, the masked autoencoders are used as scalable self-supervised learners for computer vision which reconstruct portions of originally patched images. Here, the facility to obfuscate portions of an input image and the ability to reconstruct original images is used as an encryption-decryption scheme. In the final form, masked autoencoders are combined with neural cryptography consisting of a tree parity machine and the Shamir scheme for secret image sharing. The proposed technique MANC helps to recover the loss in the image due to noise during secret sharing of the image.
• Uses recently proposed masked autoencoders, originally designed as scalable self-supervised learners for computer vision, in an encryption-decryption setup.
• Combines autoencoders with neural cryptography. The advantages our proposed approach offers over existing techniques are that (i) neural cryptography is a new type of public key cryptography that is not based on number theory, requires less computing time and memory, and is non-deterministic in nature; and (ii) masked autoencoders provide an additional level of obfuscation through their deep learning architecture.
• The proposed scheme was evaluated on a dataset consisting of CT scans made public by The Cancer Imaging Archive (TCIA). The proposed method produces better RMSE values between the input and the encrypted image and comparable correlation values between the input and the output image with respect to existing techniques.
- Published
- 2024
- Full Text
- View/download PDF
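For context on the tree parity machine named in the subjects above, the standard Hebbian synchronization step looks roughly like this (a sketch of the textbook TPM rule, not code from MANC): a hidden unit moves its weights only when its output agreed with the whole machine's output, and two machines trained on each other's agreeing outputs converge to identical weights usable as a shared key:

```python
import numpy as np

def tpm_output(W, X):
    """Hidden-unit signs and overall output tau of a tree parity machine."""
    sigma = np.sign(np.sum(W * X, axis=1))
    sigma[sigma == 0] = -1
    return sigma, int(np.prod(sigma))

def tpm_hebbian_step(W, X, L=3):
    """Hebbian update: only hidden units that agreed with the machine's
    output tau move their weights (by tau * input), clipped to [-L, L]."""
    sigma, tau = tpm_output(W, X)
    agree = sigma == tau
    W[agree] = np.clip(W[agree] + tau * X[agree], -L, L)
    return W

rng = np.random.default_rng(4)
W = rng.integers(-3, 4, size=(3, 8))      # K=3 hidden units, N=8 inputs each
X = rng.choice([-1, 1], size=(3, 8))
W = tpm_hebbian_step(W, X)
```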
17. Tyro3 promotes the maturation of glutamatergic synapses.
- Author
- Miao, Sheng, Fourgeaud, Lawrence, Burrola, Patrick G., Stern, Shani, Zhang, Yuhan, Happonen, Kaisa E., Novak, Sammy Weiser, Gage, Fred H., and Lemke, Greg
- Subjects
- SYNAPSES, GLUTAMATE receptors, PROTEIN-tyrosine kinases, AMPA receptors, CELL membranes, KINASES
- Abstract
The receptor tyrosine kinase Tyro3 is abundantly expressed in neurons of the neocortex, hippocampus, and striatum, but its role in these cells is unknown. We found that neuronal expression of this receptor was markedly up-regulated in the postnatal mouse neocortex immediately prior to the final development of glutamatergic synapses. In the absence of Tyro3, cortical and hippocampal synapses never completed end-stage differentiation and remained electrophysiologically and ultrastructurally immature. Tyro3-/- cortical neurons also exhibited diminished plasma membrane expression of the GluA2 subunits of AMPA-type glutamate receptors, which are essential to mature synaptic function. Correspondingly, GluA2 membrane insertion in wild-type neurons was stimulated by Gas6, a Tyro3 ligand widely expressed in the postnatal brain. Behaviorally, Tyro3-/- mice displayed learning enhancements in spatial recognition and fear-conditioning assays. Together, these results demonstrate that Tyro3 promotes the functional maturation of glutamatergic synapses by driving plasma membrane translocation of GluA2 AMPA receptor subunits. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Hebbian Learning-Guided Random Walks for Enhanced Community Detection in Correlation-Based Brain Networks
- Author
- Sotero, Roberto C., Sanchez-Bornot, Jose M., Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Quaresma, Paulo, editor, Camacho, David, editor, Yin, Hujun, editor, Gonçalves, Teresa, editor, Julian, Vicente, editor, and Tallón-Ballesteros, Antonio J., editor
- Published
- 2023
- Full Text
- View/download PDF
19. A Theoretical Study on Artificial Intelligence Training
- Author
- Han, Donghyeon and Yoo, Hoi-Jun
- Published
- 2023
- Full Text
- View/download PDF
20. Brain-like Combination of Feedforward and Recurrent Network Components Achieves Prototype Extraction and Robust Pattern Recognition
- Author
- Ravichandran, Naresh Balaji, Lansner, Anders, Herman, Pawel, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Nicosia, Giuseppe, editor, Ojha, Varun, editor, La Malfa, Emanuele, editor, La Malfa, Gabriele, editor, Pardalos, Panos, editor, Di Fatta, Giuseppe, editor, Giuffrida, Giovanni, editor, and Umeton, Renato, editor
- Published
- 2023
- Full Text
- View/download PDF
21. Neurocognition and Movement
- Author
- Voelcker-Rehage, Claudia, Kutz, Dieter F., Julian, Ross, Schüler, Julia, editor, Wegner, Mirko, editor, Plessner, Henning, editor, and Eklund, Robert C., editor
- Published
- 2023
- Full Text
- View/download PDF
22. An Adaptive Network Model Simulating the Effects of Different Culture Types and Leader Qualities on Mistake Handling and Organisational Learning
- Author
- Samhan, Natalie, Treur, Jan, Kucharska, Wioleta, Wiewiora, Anna, Kacprzyk, Janusz, Series Editor, Cherifi, Hocine, editor, Mantegna, Rosario Nunzio, editor, Rocha, Luis M., editor, Cherifi, Chantal, editor, and Miccichè, Salvatore, editor
- Published
- 2023
- Full Text
- View/download PDF
23. Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks.
- Author
- Demidovskij, A. V., Kazyulina, M. S., Salnikov, I. G., Tugaryov, A. M., Trutnev, A. I., and Pavlov, S. V.
- Abstract
Given the unprecedented growth of deep learning applications, training acceleration is becoming a subject of strong academic interest. Hebbian learning, as a training strategy alternative to backpropagation, presents a promising optimization approach due to its locality, lower computational complexity and parallelization potential. Nevertheless, due to the challenging optimization of Hebbian learning, there is no widely accepted approach to the implementation of such mixed strategies. The current paper reviews the four main strategies for updating weights using the Hebbian rule, including its widely used modifications, Oja's and Instar rules. Additionally, the paper analyses 21 industrial implementations of Hebbian learning, discusses the merits and shortcomings of Hebbian rules, and presents the results of computational experiments on 4 convolutional networks. Experiments show that the most efficient implementation strategy of Hebbian learning allows for acceleration and reduced memory consumption when updating DenseNet121 weights compared to backpropagation. Finally, a comparative analysis of the implementation strategies is carried out and grounded recommendations for Hebbian learning application are formulated. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
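As a reference for the rules named in the abstract above, here they are in minimal form (vector input x, scalar output y; a sketch, not the paper's implementations):

```python
import numpy as np

def hebb(w, x, y, lr=0.01):
    """Plain Hebbian rule: correlation-driven, grows without bound."""
    return w + lr * y * x

def oja(w, x, y, lr=0.01):
    """Oja's rule: the y^2 * w decay term normalizes ||w||."""
    return w + lr * y * (x - y * w)

def instar(w, x, y, lr=0.01):
    """Instar rule: when the unit is active, w moves toward x, so the
    weight vector becomes a template of the inputs that excite it."""
    return w + lr * y * (x - w)
```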
24. Local minimization of prediction errors drives learning of invariant object representations in a generative network model of visual perception.
- Author
- Brucklacher, Matthias, Bohté, Sander M., Mejias, Jorge F., and Pennartz, Cyriel M. A.
- Subjects
- VISUAL perception, LINEAR network coding, IMAGE representation, MACAQUES, FORECASTING, FUSIFORM gyrus
- Abstract
The ventral visual processing hierarchy of the cortex needs to fulfill at least two key functions: perceived objects must be mapped to high-level representations invariantly of the precise viewing conditions, and a generative model must be learned that allows, for instance, to fill in occluded information guided by visual experience. Here, we show how a multilayered predictive coding network can learn to recognize objects from the bottom up and to generate specific representations via a top-down pathway through a single learning rule: the local minimization of prediction errors. Trained on sequences of continuously transformed objects, neurons in the highest network area become tuned to object identity invariant of precise position, comparable to inferotemporal neurons in macaques. Drawing on this, the dynamic properties of invariant object representations reproduce experimentally observed hierarchies of timescales from low to high levels of the ventral processing stream. The predicted faster decorrelation of error-neuron activity compared to representation neurons is of relevance for the experimental search for neural correlates of prediction errors. Lastly, the generative capacity of the network is confirmed by reconstructing specific object images, robust to partial occlusion of the inputs. By learning invariance from temporal continuity within a generative model, the approach generalizes the predictive coding framework to dynamic inputs in a more biologically plausible way than self-supervised networks with non-local error-backpropagation. This was achieved simply by shifting the training paradigm to dynamic inputs, with little change in architecture and learning rule from static input-reconstructing Hebbian predictive coding networks. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
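A minimal sketch of the single learning rule the abstract describes, local minimization of prediction errors (illustrative shapes and names; the actual model is multilayered and trained on sequences of transformed objects):

```python
import numpy as np

def local_pc_step(W, r_lower, r_upper, lr_w=0.01, lr_r=0.1):
    """One local predictive-coding step: the layer above predicts the
    layer below through W; both the Hebbian-like weight update and the
    representation update use only the locally available error, with
    no backpropagated gradients."""
    error = r_lower - W @ r_upper              # local prediction error
    W = W + lr_w * np.outer(error, r_upper)    # error x activity
    r_upper = r_upper + lr_r * (W.T @ error)   # representation follows same error
    return W, r_upper
```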
25. Short duration event related cerebellar TDCS enhances visuomotor adaptation
- Author
- Matthew Weightman, Neeraj Lalji, Chin-Hsuan Sophie Lin, Joseph M. Galea, Ned Jenkinson, and R. Chris Miall
- Subjects
- Transcranial electrical stimulation, Cerebellum, Visuomotor adaptation, Hebbian learning, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
Background: Transcranial direct current stimulation (TDCS) is typically applied before or during a task, for periods ranging from 5 to 30 min. Hypothesis: We hypothesise that briefer stimulation epochs synchronous with individual task actions may be more effective. Methods: In two separate experiments, we applied brief bursts of event-related anodal stimulation (erTDCS) to the cerebellum during a visuomotor adaptation task. Results: The first study demonstrated that 1 s duration erTDCS time-locked to the participants’ reaching actions enhanced adaptation significantly better than sham. A close replication in the second study demonstrated 0.5 s erTDCS synchronous with the reaching actions again resulted in better adaptation than standard TDCS, significantly better than sham. Stimulation either during the inter-trial intervals between movements or after movement, during assessment of visual feedback, had no significant effect. Because short duration stimulation with rapid onset and offset is more readily perceived by the participants, we additionally show that a non-electrical vibrotactile stimulation of the scalp, presented with the same timing as the erTDCS, had no significant effect. Conclusions: We conclude that short duration, event related, anodal TDCS targeting the cerebellum enhances motor adaptation compared to the standard model. We discuss possible mechanisms of action and speculate on neural learning processes that may be involved.
- Published
- 2023
- Full Text
- View/download PDF
26. Exploration and extension of the similarity matching framework : feature learning, nonlinear methods and transformation learning
- Author
- Bahroun, Yanis
- Subjects
- 006.3, Feature learning, Neural networks, Unsupervised learning, Sparse coding, Dimensionality reduction, Hebbian learning, Online learning
- Abstract
Similarity matching (SM) is a framework introduced recently for deriving biologically plausible neural networks from objective functions. Three key biological properties associated with these networks are 1) Hebbian rules, 2) unsupervised learning, and 3) online implementations. In particular, previous work has demonstrated that unconstrained-in-sign SM (USM) and nonnegative SM (NSM) can lead to neural networks (NN) performing linear principal subspace projection (PSP) and clustering. Starting from USM and NSM, the work undertaken in this thesis 'explores' the capabilities and performance of SM and 'extends' SM to novel sets of NNs and unsupervised learning tasks. The first objective of this work is to explore the capabilities of existing SM NN for feature learning. Representations learned from different SM NN are used as input to a linear classifier to measure their classification accuracy on established image datasets. The NN derived from NSM is employed to learn features from images with single and dual-layer architectures. The simulations show that features learned by NSM are comparable to Gabor filters and that a simple single-layer Hebbian network can outperform more complex models. The NN derived from USM is used for learning features in combination with block-wise histograms and binary hashing. The proposed set of architectures (USMNet), when evaluated in terms of accuracy, proves competitive against unsupervised learning algorithms and multi-layer networks. Finally, Deep Hebbian Networks (DHN) are proposed. DHN combine within one architecture stages of NSM and USM. The performance of DHNs is evaluated on image classification tasks, on which they outperform the aforementioned models. The second objective of this work is to extend SM beyond linear methods and static images. To incorporate nonlinearity, kernel-based versions of SM, K-USM and K-NSM, are proposed and map onto NNs performing nonlinear online clustering and PSP, outperforming traditional methods. To incorporate temporal information, a new SM cost-function is applied to pairs of consecutive images to develop the TNSM algorithm. This is mapped onto a NN that performs motion detection and recapitulates several salient features of the fly visual system. The proposed approach is also applicable to the general problem of transformation learning.
- Published
- 2020
- Full Text
- View/download PDF
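For readers unfamiliar with the framework, the batch similarity matching objective at the root of this thesis can be stated in a few lines (a sketch; NSM adds the constraint Y >= 0, and the biologically plausible networks arise from optimizing this online rather than in batch):

```python
import numpy as np

def similarity_matching_cost(X, Y):
    """Similarity matching: make output similarities Y^T Y mirror input
    similarities X^T X (columns of X and Y are samples). Constraining
    Y >= 0 (NSM) pushes the optimal Y toward clustering-like codes."""
    return np.linalg.norm(X.T @ X - Y.T @ Y, ord="fro") ** 2
```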
27. Hebbian Learning
- Author
- Choe, Yoonsuck, Migliore, Michele, Section editor, Linster, Christiane, Section editor, Cavarretta, Francesco, Section editor, Jaeger, Dieter, editor, and Jung, Ranu, editor
- Published
- 2022
- Full Text
- View/download PDF
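The reference-work entry above covers the rule behind this whole results list; in its simplest rate-based form it is one line (a minimal sketch):

```python
import numpy as np

def hebb_update(W, pre, post, lr=0.01):
    """'Cells that fire together wire together': each weight W[i, j] grows
    with the product of postsynaptic activity post[i] and presynaptic
    activity pre[j]."""
    return W + lr * np.outer(post, pre)
```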
28. FastHebb: Scaling Hebbian Training of Deep Neural Networks to ImageNet Level
- Author
- Lagani, Gabriele, Gennaro, Claudio, Fassold, Hannes, Amato, Giuseppe, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Skopal, Tomáš, editor, Falchi, Fabrizio, editor, Lokoč, Jakub, editor, Sapino, Maria Luisa, editor, Bartolini, Ilaria, editor, and Patella, Marco, editor
- Published
- 2022
- Full Text
- View/download PDF
29. Memristive Models for the Emulation of Biological Learning
- Author
- Ziegler, Martin, Kohlstedt, Hermann, Chua, Leon O., editor, Tetzlaff, Ronald, editor, and Slavova, Angela, editor
- Published
- 2022
- Full Text
- View/download PDF
30. Evaluating Hebbian Learning in a Semi-supervised Setting
- Author
- Lagani, Gabriele, Falchi, Fabrizio, Gennaro, Claudio, Amato, Giuseppe, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Nicosia, Giuseppe, editor, Ojha, Varun, editor, La Malfa, Emanuele, editor, La Malfa, Gabriele, editor, Jansen, Giorgio, editor, Pardalos, Panos M., editor, Giuffrida, Giovanni, editor, and Umeton, Renato, editor
- Published
- 2022
- Full Text
- View/download PDF
31. Training Convolutional Neural Networks with Competitive Hebbian Learning Approaches
- Author
- Lagani, Gabriele, Falchi, Fabrizio, Gennaro, Claudio, Amato, Giuseppe, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Nicosia, Giuseppe, editor, Ojha, Varun, editor, La Malfa, Emanuele, editor, La Malfa, Gabriele, editor, Jansen, Giorgio, editor, Pardalos, Panos M., editor, Giuffrida, Giovanni, editor, and Umeton, Renato, editor
- Published
- 2022
- Full Text
- View/download PDF
32. Dual functional states of working memory realized by memristor-based neural network.
- Author
- Hongzhe Wang, Xinqiang Pan, Junjie Wang, Mingyuan Sun, Chuangui Wu, Qi Yu, Zhen Liu, Tupei Chen, and Yang Liu
- Subjects
- SHORT-term memory, ACTION potentials, BIOLOGICALLY inspired computing
- Abstract
Working memory refers to the brain's ability to store and manipulate information for a short period. It is thought to rely on two (still debated) mechanisms: sustained neuronal firing and "activity-silent" working memory. To develop a highly biologically plausible neuromorphic computing system, it is desirable to physically realize working memory in a way that corresponds to both of these mechanisms. In this study, we propose a memristor-based neural network to realize the sustained neural firing and activity-silent working memory, which are reflected as dual functional states within memory. Memristor-based synapses and two types of artificial neurons are designed for the Winner-Takes-All learning rule. During the cognitive task, state transformation between the "focused" state and the "unfocused" state of working memory is demonstrated. This work paves the way for further emulating the complex working memory functions with distinct neural activities in our brains. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
33. Totipotent neural controllers for modular soft robots: Achieving specialization in body–brain co-evolution through Hebbian learning.
- Author
- Ferigo, Andrea, Iacca, Giovanni, Medvet, Eric, and Nadizar, Giorgia
- Subjects
- ARTIFICIAL neural networks, ARTIFICIAL intelligence, MACHINE learning, NEUROPLASTICITY, EVOLUTIONARY algorithms
- Abstract
Multi-cellular organisms typically originate from a single cell, the zygote, that then develops into a multitude of structurally and functionally specialized cells. The potential of generating all the specialized cells that make up an organism is referred to as cellular "totipotency", a concept introduced by the German plant physiologist Haberlandt in the early 1900s. In an attempt to reproduce this mechanism in synthetic organisms, we present a model based on a kind of modular robot called Voxel-based Soft Robot (VSR), where both the body, i.e., the arrangement of voxels, and the brain, i.e., the Artificial Neural Network (ANN) controlling each module, are subject to an evolutionary process aimed at optimizing the locomotion capabilities of the robot. In an analogy between totipotent cells and totipotent ANN-controlled modules, we then include in our model an additional level of adaptation provided by Hebbian learning, which allows the ANNs to adapt their weights during the execution of the locomotion task. Our in silico experiments reveal two main findings. Firstly, we confirm the common intuition that Hebbian plasticity effectively allows better performance and adaptation. Secondly and more importantly, we verify for the first time that the performance improvements yielded by plasticity are in essence due to a form of specialization at the level of single modules (and their associated ANNs): thanks to plasticity, modules specialize to react in different ways to the same set of stimuli, i.e., they become functionally and behaviorally different even though their ANNs are initialized in the same way. This mechanism, which can be seen as a form of totipotency at the level of ANNs, can have, in our view, profound implications in various areas of Artificial Intelligence (AI) and applications thereof, such as modular robotics and multi-agent systems.
• We apply Hebbian learning to encourage specialization in voxel-based soft robots.
• We show that Hebbian learning effectively allows better performance and adaptation.
• Neural networks specialize their behavior based on their position in the robot. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
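As a sketch of how Hebbian plasticity lets identically initialized controllers diverge, here is the kind of parameterized Hebbian update common in the evolved-plasticity literature (the ABCD form is an assumption for illustration, not necessarily the paper's exact rule): evolution fixes the coefficients, while each module's weights then change online with its own stimuli:

```python
def abcd_hebbian(w, x, y, A, B, C, D, lr=0.1):
    """Parameterized Hebbian update (assumed ABCD form): the change mixes
    a correlation term (A * x * y) with purely presynaptic, postsynaptic,
    and constant terms. Modules receiving different stimuli drift to
    different weights, i.e. they specialize."""
    return w + lr * (A * x * y + B * x + C * y + D)

# Toy usage: two modules share an initial weight but see different stimuli.
w1 = abcd_hebbian(0.5, x=1.0, y=0.2, A=1.0, B=0.0, C=0.1, D=0.0)
w2 = abcd_hebbian(0.5, x=0.0, y=0.8, A=1.0, B=0.0, C=0.1, D=0.0)
print(w1, w2)   # 0.522 vs 0.508: the modules specialize
```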
34. Unsupervised end-to-end training with a self-defined target
- Author
- Dongshu Liu, Jérémie Laydevant, Adrien Pontlevy, Damien Querlioz, and Julie Grollier
- Subjects
- backpropagation, equilibrium propagation, Hebbian learning, edge AI hardware, unsupervised learning, semi-supervised learning, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Designing algorithms for versatile AI hardware that can learn on the edge using both labeled and unlabeled data is challenging. Deep end-to-end training methods incorporating phases of self-supervised and supervised learning are accurate and adaptable to input data but self-supervised learning requires even more computational and memory resources than supervised learning, too high for current embedded hardware. Conversely, unsupervised layer-by-layer training, such as Hebbian learning, is more compatible with existing hardware but does not integrate well with supervised learning. To address this, we propose a method enabling networks or hardware designed for end-to-end supervised learning to also perform high-performance unsupervised learning by adding two simple elements to the output layer: winner-take-all selectivity and homeostasis regularization. These mechanisms introduce a ‘self-defined target’ for unlabeled data, allowing purely unsupervised training for both fully-connected and convolutional layers using backpropagation or equilibrium propagation on datasets like MNIST (up to 99.2%), Fashion-MNIST (up to 90.3%), and SVHN (up to 81.5%). We extend this method to semi-supervised learning, adjusting targets based on data type, achieving 96.6% accuracy with only 600 labeled MNIST samples in a multi-layer perceptron. Our results show that this approach can effectively enable networks and hardware initially dedicated to supervised learning to also perform unsupervised learning, adapting to varying availability of labeled data.
- Published
- 2024
- Full Text
- View/download PDF
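A minimal sketch of the "self-defined target" mechanism described above (illustrative; the paper pairs it with backpropagation or equilibrium propagation): winner-take-all selectivity turns the most active output into a one-hot pseudo-label, and a homeostatic bias keeps any single unit from winning every sample:

```python
import numpy as np

def self_defined_target(logits, usage, homeo=0.5):
    """Winner-take-all pseudo-label for an unlabeled sample: the unit with
    the highest homeostasis-adjusted activity becomes the target class,
    and its usage count rises so over-used units are handicapped later."""
    winner = int(np.argmax(logits - homeo * usage))
    target = np.zeros_like(logits)
    target[winner] = 1.0
    usage[winner] += 1.0
    return target, usage

logits = np.array([0.2, 1.5, -0.3, 0.9])
usage = np.zeros(4)
target, usage = self_defined_target(logits, usage)
print(target)   # [0. 1. 0. 0.] -- trainable with ordinary supervised losses
```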
35. Short duration event related cerebellar TDCS enhances visuomotor adaptation.
- Author
- Weightman, Matthew, Lalji, Neeraj, Lin, Chin-Hsuan Sophie, Galea, Joseph M., Jenkinson, Ned, and Miall, R. Chris
- Abstract
Transcranial direct current stimulation (TDCS) is typically applied before or during a task, for periods ranging from 5 to 30 min. We hypothesise that briefer stimulation epochs synchronous with individual task actions may be more effective. In two separate experiments, we applied brief bursts of event-related anodal stimulation (erTDCS) to the cerebellum during a visuomotor adaptation task. The first study demonstrated that 1 s duration erTDCS time-locked to the participants' reaching actions enhanced adaptation significantly better than sham. A close replication in the second study demonstrated 0.5 s erTDCS synchronous with the reaching actions again resulted in better adaptation than standard TDCS, significantly better than sham. Stimulation either during the inter-trial intervals between movements or after movement, during assessment of visual feedback, had no significant effect. Because short duration stimulation with rapid onset and offset is more readily perceived by the participants, we additionally show that a non-electrical vibrotactile stimulation of the scalp, presented with the same timing as the erTDCS, had no significant effect. We conclude that short duration, event related, anodal TDCS targeting the cerebellum enhances motor adaptation compared to the standard model. We discuss possible mechanisms of action and speculate on neural learning processes that may be involved.
• Brief event-related TDCS (erTDCS) enhances visuomotor adaptation.
• ErTDCS synchronous with reaching-to-target movement is most effective.
• No effect of asynchronous erTDCS or during error feedback processing.
• ErTDCS may be a useful new protocol to dissect task components of learning. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. Scandium Nitride as a Gateway III‐Nitride Semiconductor for both Excitatory and Inhibitory Optoelectronic Artificial Synaptic Devices.
- Author
- Rao, Dheemahi, Pillai, Ashalatha Indiradevi Kamalasanan, Garbrecht, Magnus, and Saha, Bivas
- Subjects
- SCANDIUM, NITRIDES, SEMICONDUCTORS, NEUROPLASTICITY, PHOTOCONDUCTIVITY, COMPLEMENTARY metal oxide semiconductors, CARRIER density
- Abstract
Traditional computation based on von Neumann architecture is limited by time and energy consumption due to data transfer between the storage and the processing units. The von Neumann architecture is also inefficient in solving unstructured, probabilistic, and real‐time problems. To address these challenges, a new brain‐inspired neuromorphic computational architecture is required. Due to the absence of resistance–capacitance delay, high bandwidth, and low power consumption, optoelectronic artificial synaptic devices are highly attractive. Yet, stable, scalable, and complementary metal–oxide–semiconductor (CMOS)‐compatible materials exhibiting both inhibitory and excitatory optoelectronic synaptic functionalities have not been demonstrated. Here, epitaxial CMOS‐compatible scandium nitride (ScN) optoelectronic artificial synaptic devices that emulate both inhibitory and excitatory biological synaptic activities are presented. The negative and positive persistent photoconductivity of undoped and magnesium‐doped ScN is equated to the inhibitory and excitatory synaptic plasticity, respectively, which leads to functionalities like learning–forgetting, frequency‐selective optical filtering, frequency‐dependent potentiation and depression, Hebbian learning, and logic‐gate operations. Temperature‐dependent photoresponse and photo‐Hall measurements reveal that scattering of photogenerated carriers from charged defect centers results in negative photoconductivity in undoped degenerate ScN. This work opens up the possibility of utilizing a group‐III epitaxial semiconducting nitride material with inhibitory and excitatory optoelectronic synaptic functionalities for practical neuromorphic applications. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
37. Influence of language on perception and concept formation in a brain-constrained deep neural network model.
- Author
- Henningsen-Schomers, Malte R., Garagnani, Max, and Pulvermüller, Friedemann
- Subjects
- ACTIVE learning, LEARNING, CONCEPT learning, SOCIAL interaction, NEUROLINGUISTICS
- Abstract
A neurobiologically constrained model of semantic learning in the human brain was used to simulate the acquisition of concrete and abstract concepts, either with or without verbal labels. Concept acquisition and semantic learning were simulated using Hebbian learning mechanisms. We measured the network's category learning performance, defined as the extent to which it successfully (i) grouped partly overlapping perceptual instances into a single (abstract or concrete) conceptual representation, while (ii) still distinguishing representations for distinct concepts. Co-presence of linguistic labels with perceptual instances of a given concept generally improved the network's learning of categories, with a significantly larger beneficial effect for abstract than concrete concepts. These results offer a neurobiological explanation for causal effects of language structure on concept formation and on perceptuo-motor processing of instances of these concepts: supplying a verbal label during concept acquisition improves the cortical mechanisms by which experiences with objects and actions along with the learning of words lead to the formation of neuronal ensembles for specific concepts and meanings. Furthermore, the present results make a novel prediction, namely, that such 'Whorfian' effects should be modulated by the concreteness/abstractness of the semantic categories being acquired, with language labels supporting the learning of abstract concepts more than that of concrete ones. This article is part of the theme issue 'Concepts in interaction: social engagement and inner experiences'. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
38. Modelling Metaplasticity and Memory Reconsolidation During an Eye-Movement Desensitization and Reprocessing Treatment
- Author
- Zegerius, Lennart, Treur, Jan, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Samsonovich, Alexei V., editor, Gudwin, Ricardo R., editor, and Simões, Alexandre da Silva, editor
- Published
- 2021
- Full Text
- View/download PDF
39. Brain-Like Approaches to Unsupervised Learning of Hidden Representations - A Comparative Study
- Author
- Ravichandran, Naresh Balaji, Lansner, Anders, Herman, Pawel, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Farkaš, Igor, editor, Masulli, Paolo, editor, Otte, Sebastian, editor, and Wermter, Stefan, editor
- Published
- 2021
- Full Text
- View/download PDF
40. Scandium Nitride as a Gateway III‐Nitride Semiconductor for both Excitatory and Inhibitory Optoelectronic Artificial Synaptic Devices
- Author
- Dheemahi Rao, Ashalatha Indiradevi Kamalasanan Pillai, Magnus Garbrecht, and Bivas Saha
- Subjects
- artificial optoelectronic synapses, excitatory and inhibitory synapses, Hebbian learning, learning–forgetting, logic gates, long‐term memory, Electric apparatus and materials. Electric circuits. Electric networks, TK452-454.4, Physics, QC1-999
- Abstract
Traditional computation based on von Neumann architecture is limited by time and energy consumption due to data transfer between the storage and the processing units. The von Neumann architecture is also inefficient in solving unstructured, probabilistic, and real‐time problems. To address these challenges, a new brain‐inspired neuromorphic computational architecture is required. Due to the absence of resistance–capacitance delay, high bandwidth, and low power consumption, optoelectronic artificial synaptic devices are highly attractive. Yet, stable, scalable, and complementary metal–oxide–semiconductor (CMOS)‐compatible materials exhibiting both inhibitory and excitatory optoelectronic synaptic functionalities have not been demonstrated. Here, epitaxial CMOS‐compatible scandium nitride (ScN) optoelectronic artificial synaptic devices that emulate both inhibitory and excitatory biological synaptic activities are presented. The negative and positive persistent photoconductivity of undoped and magnesium‐doped ScN is equated to the inhibitory and excitatory synaptic plasticity, respectively, which leads to functionalities like learning–forgetting, frequency‐selective optical filtering, frequency‐dependent potentiation and depression, Hebbian learning, and logic‐gate operations. Temperature‐dependent photoresponse and photo‐Hall measurements reveal that scattering of photogenerated carriers from charged defect centers results in negative photoconductivity in undoped degenerate ScN. This work opens up the possibility of utilizing a group‐III epitaxial semiconducting nitride material with inhibitory and excitatory optoelectronic synaptic functionalities for practical neuromorphic applications.
- Published
- 2023
- Full Text
- View/download PDF
41. Using Hebbian Learning for Training Spiking Neural Networks to Control Fingers of Robotic Hands.
- Author
- Uleru, George-Iulian, Hulea, Mircea, and Manta, Vasile-Ion
- Subjects
- FINGERS, ARTIFICIAL neural networks, ROBOT hands, SHAPE memory alloys, ARTIFICIAL hands, LEARNING ability
- Abstract
Adaptability is one of the main characteristics of the bio-inspired control units for anthropomorphic robotic hands. This characteristic provides the artificial hands with the ability to learn new motions and to improve the accuracy of the known ones. This paper presents a method to train spiking neural networks (SNNs) to control anthropomorphic fingers using proprioceptive sensors and Hebbian learning. Being inspired by physical guidance (PG), the proposed method eliminates the need for complex processing of the natural hand motions. To validate the proposed concept, we implemented an electronic SNN that learns, using the output of neuromorphic flexion and force sensors, to control two opposing fingers actuated by shape memory alloys. Learning occurs when the untrained neural paths triggered by a command signal activate concurrently with the sensor-specific neural paths that drive the motion detected by the flexion sensors. The results show that a SNN with a few neurons connects, by synaptic potentiation, the input neurons activated by the command signal to the output neurons which are activated during the passive finger motions. This mechanism is validated for grasping when the SNN is trained to flex the index and thumb fingers simultaneously when a push button is pressed. The proposed concept is suitable for implementing the neural control units of anthropomorphic robots which are able to learn motions by PG with a proper sensor configuration. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
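The learning mechanism in the abstract above amounts to potentiation on coincident activity; a toy sketch (names and constants are illustrative, not the paper's circuit):

```python
def potentiate(w, command_active, motion_active, step=0.05, w_max=1.0):
    """Strengthen a command-to-motor synapse only when the command signal
    and the sensor-driven motion path fire concurrently, so after guided
    repetitions the command alone can drive the trained motion."""
    if command_active and motion_active:
        w = min(w_max, w + step)
    return w

w = 0.0
for _ in range(10):                 # ten physically guided repetitions
    w = potentiate(w, True, True)
print(f"synaptic weight after guidance: {w:.2f}")
```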
42. Towards the Neuroevolution of Low-level artificial general intelligence
- Author
- Sidney Pontes-Filho, Kristoffer Olsen, Anis Yazidi, Michael A. Riegler, Pål Halvorsen, and Stefano Nichele
- Subjects
- neuroevolution, artificial general intelligence, spiking neural network, spike-timing-dependent plasticity, Hebbian learning, weight agnostic neural network, Mechanical engineering and machinery, TJ1-1570, Electronic computers. Computer science, QA75.5-76.95
- Abstract
In this work, we argue that the search for Artificial General Intelligence should start from a much lower level than human-level intelligence. The circumstances of intelligent behavior in nature resulted from an organism interacting with its surrounding environment, which could change over time and exert pressure on the organism to allow for learning of new behaviors or environment models. Our hypothesis is that learning occurs through interpreting sensory feedback when an agent acts in an environment. For that to happen, a body and a reactive environment are needed. We evaluate Neuroevolution of Artificial General Intelligence, a framework for low-level artificial general intelligence, as a method to evolve a biologically inspired artificial neural network that learns from environment reactions. This method allows the evolutionary complexification of a randomly-initialized spiking neural network with adaptive synapses, which controls agents instantiated in mutable environments. Such a configuration allows us to benchmark the adaptivity and generality of the controllers. The chosen tasks in the mutable environments are food foraging, emulation of logic gates, and cart-pole balancing. The three tasks are successfully solved with rather small network topologies, which opens up the possibility of experimenting with more complex tasks and scenarios where curriculum learning is beneficial.
- Published
- 2022
- Full Text
- View/download PDF
43. A Second-Order Adaptive Social-Cognitive Agent Model for Prisoner Recidivism
- Author
- Melman, Dorien, Ploeger, Janne B., Treur, Jan, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Kotenko, Igor, Editorial Board Member, Prates, Raquel Oliveira, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, De La Prieta, Fernando, editor, Mathieu, Philippe, editor, Rincón Arango, Jaime Andrés, editor, El Bolock, Alia, editor, Del Val, Elena, editor, Jordán Prunera, Jaume, editor, Carneiro, João, editor, Fuentes, Rubén, editor, Lopes, Fernando, editor, and Julian, Vicente, editor
- Published
- 2020
- Full Text
- View/download PDF
44. Relating a Reified Adaptive Network’s Emerging Behaviour Based on Hebbian Learning to Its Reified Network Structure
- Author
- Treur, Jan, Kacprzyk, Janusz, Series Editor, and Treur, Jan
- Published
- 2020
- Full Text
- View/download PDF
45. A Reified Network Model for Adaptive Decision Making Based on the Disconnect-Reconnect Adaptation Principle
- Author
- Treur, Jan, Kacprzyk, Janusz, Series Editor, and Treur, Jan
- Published
- 2020
- Full Text
- View/download PDF
46. An Adaptive Cognitive Temporal-Causal Network Model of a Mindfulness Therapy Based on Humor
- Author
- Sahand Mohammadi Ziabari, S., Treur, Jan, Spagnoletti, Paolo, Series Editor, De Marco, Marco, Series Editor, Pouloudi, Nancy, Series Editor, Te'eni, Dov, Series Editor, vom Brocke, Jan, Series Editor, Winter, Robert, Series Editor, Baskerville, Richard, Series Editor, Davis, Fred D., editor, Riedl, René, editor, Léger, Pierre-Majorique, editor, Randolph, Adriane, editor, and Fischer, Thomas, editor
- Published
- 2020
- Full Text
- View/download PDF
47. A Modeling Environment for Dynamic and Adaptive Network Models Implemented in MATLAB
- Author
- Mohammadi Ziabari, S. Sahand, Treur, Jan, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Yang, Xin-She, editor, Sherratt, Simon, editor, Dey, Nilanjan, editor, and Joshi, Amit, editor
- Published
- 2020
- Full Text
- View/download PDF
48. Differential Evolution Trained Fuzzy Cognitive Map: An Application to Modeling Efficiency in Banking
- Author
- Jaya Krishna, Gutha, Smruthi, Meesala, Ravi, Vadlamani, Shandilya, Bhamidipati, Kacprzyk, Janusz, Series Editor, Pal, Nikhil R., Advisory Editor, Bello Perez, Rafael, Advisory Editor, Corchado, Emilio S., Advisory Editor, Hagras, Hani, Advisory Editor, Kóczy, László T., Advisory Editor, Kreinovich, Vladik, Advisory Editor, Lin, Chin-Teng, Advisory Editor, Lu, Jie, Advisory Editor, Melin, Patricia, Advisory Editor, Nedjah, Nadia, Advisory Editor, Nguyen, Ngoc Thanh, Advisory Editor, Wang, Jun, Advisory Editor, Abraham, Ajith, editor, Cherukuri, Aswani Kumar, editor, and Gandhi, Niketa, editor
- Published
- 2020
- Full Text
- View/download PDF
49. On Tensor Distances for Self Organizing Maps: Clustering Cognitive Tasks
- Author
- Drakopoulos, Georgios, Giannoukou, Ioanna, Mylonas, Phivos, Sioutas, Spyros, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Hartmann, Sven, editor, Küng, Josef, editor, Kotsis, Gabriele, editor, Tjoa, A Min, editor, and Khalil, Ismail, editor
- Published
- 2020
- Full Text
- View/download PDF
50. Biologically Plausible Learning of Text Representation with Spiking Neural Networks
- Author
- Białas, Marcin, Mirończuk, Marcin Michał, Mańdziuk, Jacek, Goos, Gerhard, Founding Editor, Hartmanis, Juris, Founding Editor, Bertino, Elisa, Editorial Board Member, Gao, Wen, Editorial Board Member, Steffen, Bernhard, Editorial Board Member, Woeginger, Gerhard, Editorial Board Member, Yung, Moti, Editorial Board Member, Bäck, Thomas, editor, Preuss, Mike, editor, Deutz, André, editor, Wang, Hao, editor, Doerr, Carola, editor, Emmerich, Michael, editor, and Trautmann, Heike, editor
- Published
- 2020
- Full Text
- View/download PDF