
Automatic Differentiation in Prolog

Authors:
Schrijvers, Tom
Berg, Birthe van den
Riguzzi, Fabrizio
Publication Year:
2023

Abstract

Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function's (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.

Comment: Accepted for publication in the issues of Theory and Practice of Logic Programming dedicated to ICLP 2023
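For readers unfamiliar with forward-mode AD, a minimal sketch may help make the abstract concrete. The paper itself derives its algorithms in Prolog from the symbolic derivative; the snippet below is instead a hypothetical dual-number formulation in Python, chosen only because it is the shortest standard way to illustrate what "computing the numeric value of a derivative alongside the function" means:

```python
# Minimal forward-mode AD via dual numbers (illustrative only; not the
# paper's Prolog formulation, which is derived from symbolic differentiation).
from dataclasses import dataclass


@dataclass
class Dual:
    val: float  # primal value
    der: float  # derivative w.r.t. the chosen input variable

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # product rule: (u * v)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)


# f(x) = x*x + 3x has derivative f'(x) = 2x + 3, so f'(2) = 7.
x = Dual(2.0, 1.0)                # seed derivative 1 for d/dx
y = x * x + x * Dual(3.0, 0.0)
print(y.val, y.der)               # prints: 10.0 7.0
```

Reverse mode, by contrast, propagates adjoints from the output back to the inputs, which is what makes it efficient for the gradient computations used in neural-network training.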

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.07878
Document Type:
Working Paper