Correcting auto-differentiation in neural-ODE training
- Author
Xu, Yewei, Chen, Shi, Li, Qin, and Wright, Stephen J.
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning, FOS: Mathematics, Numerical Analysis (math.NA), Mathematics - Numerical Analysis, Machine Learning (cs.LG)
- Abstract
Does auto-differentiation yield reasonable updates for deep neural networks that represent neural ODEs? Through mathematical analysis and numerical evidence, we find that when the neural network employs a high-order discretization to approximate the underlying ODE flow (such as a Linear Multistep Method (LMM)), brute-force gradient computation via auto-differentiation often produces non-converging artificial oscillations. For the Leapfrog method, we propose a straightforward post-processing technique that effectively eliminates these oscillations, corrects the gradient computation, and thus yields updates that respect the underlying flow.
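The phenomenon described in the abstract can be illustrated with a small sketch (my own hypothetical example, not the authors' exact algorithm): reverse-mode differentiation through a Leapfrog discretization of the linear ODE dx/dt = -a*x produces adjoints dL/dx_n that alternate in sign because of the two-step scheme's parasitic (-1)^n mode, while the continuous adjoint e^{a(T-t)} is strictly positive. A simple adjacent-averaging post-processing (one plausible instance of the kind of correction the paper proposes) cancels the oscillation:

```python
# Hypothetical sketch: backpropagation through the Leapfrog scheme
#   x_{n+1} = x_{n-1} + 2h * f(x_n),  with  f(x) = -a*x,
# seeding dL/dx_N = 1 (i.e. loss L = x_N). The function name and the
# averaging post-processing are illustrative assumptions, not the
# paper's exact method.

def leapfrog_adjoints(a=1.0, h=0.1, N=10):
    c = -2.0 * h * a           # local derivative dx_{n+1}/dx_n
    g = [0.0] * (N + 1)
    g[N] = 1.0                 # seed adjoint at the final state
    for k in range(N, 1, -1):  # undo, in reverse, the step that produced x_k
        g[k - 2] += g[k]       # dx_k / dx_{k-2} = 1
        g[k - 1] += c * g[k]   # dx_k / dx_{k-1} = -2h*a
    # post-processing: average adjacent adjoints to cancel the
    # oscillating parasitic mode of the two-step scheme
    smoothed = [(g[n] + g[n + 1]) / 2.0 for n in range(N)]
    return g, smoothed

g, smoothed = leapfrog_adjoints()
# raw adjoints flip sign at every step; the smoothed sequence is
# sign-consistent, matching the positivity of the continuous adjoint
print(g)
print(smoothed)
```

Running this shows the raw adjoints alternating between positive and negative values at consecutive grid points, whereas the averaged sequence stays positive throughout, consistent with the continuous adjoint of dx/dt = -a*x.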
- Published
- 2023