
T-Norms Driven Loss Functions for Machine Learning

Authors :
Marra, Giuseppe
Giannini, Francesco
Diligenti, Michelangelo
Maggini, Marco
Gori, Marco
Source :
Applied Intelligence 2023, Springer
Publication Year :
2019

Abstract

Neural-symbolic approaches have recently gained popularity as a way to inject prior knowledge into a learner without requiring it to induce this knowledge from data. These approaches can potentially learn competitive solutions with a significant reduction in the amount of supervised data. A large class of neural-symbolic approaches uses First-Order Logic to represent prior knowledge, relaxed to a differentiable form using fuzzy logic. This paper shows that the loss function expressing these neural-symbolic learning tasks can be unambiguously determined given the selection of a t-norm generator. When restricted to supervised learning, the presented theoretical apparatus provides a clean justification for the popular cross-entropy loss, which has been shown to provide faster convergence and to reduce the vanishing gradient problem in very deep structures. Moreover, the proposed learning formulation extends the advantages of the cross-entropy loss to the general knowledge that can be represented by a neural-symbolic method. The methodology therefore allows the development of a novel class of loss functions, which are shown in the experimental results to lead to faster convergence rates than the approaches previously proposed in the literature.
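
The sketch below is not the authors' code; it only illustrates, under the standard fuzzy-logic setting the abstract refers to, how an additive t-norm generator can turn the truth degree of a grounded formula into a loss. All function names and the example numbers are hypothetical; the product t-norm generator g(x) = -log(x) is the case in which summing penalties over supervised groundings recovers the cross-entropy loss.

```python
# Minimal sketch, assuming the loss of a universally quantified formula is the
# sum of the generator applied to the truth degree of each grounding.
import math

def product_generator(truth):
    # Additive generator of the product t-norm: g(x) = -log(x).
    return -math.log(max(truth, 1e-12))

def lukasiewicz_generator(truth):
    # Additive generator of the Lukasiewicz t-norm: g(x) = 1 - x.
    return 1.0 - truth

def generator_loss(truth_degrees, generator=product_generator):
    # Aggregate the per-grounding penalties induced by the chosen generator.
    return sum(generator(t) for t in truth_degrees)

# Hypothetical supervised case: the truth degree of each constraint is the
# probability the network assigns to the correct class.
predicted_probs = [0.9, 0.7, 0.99]
print(generator_loss(predicted_probs))                         # cross-entropy-style loss
print(generator_loss(predicted_probs, lukasiewicz_generator))  # linear (L1-style) loss
```

Swapping the generator changes the induced loss without changing the logical knowledge being encoded, which is the degree of freedom the paper studies.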

Details

Database :
arXiv
Journal :
Applied Intelligence 2023, Springer
Publication Type :
Report
Accession number :
edsarx.1907.11468
Document Type :
Working Paper
Full Text :
https://doi.org/10.1007/s10489-022-04383-6