
Separable Approximations of Optimal Value Functions and Their Representation by Neural Networks

Authors:
Sperl, Mario
Saluzzi, Luca
Kalise, Dante
Grüne, Lars
Publication Year:
2025

Abstract

The use of separable approximations is proposed to mitigate the curse of dimensionality in the approximation of high-dimensional value functions arising in optimal control. The separable approximation exploits intrinsic decaying sensitivity properties of the system, whereby the influence of one state variable on another diminishes as their spatial, temporal, or graph-based distance grows. This property allows a global function to be represented efficiently as a sum of localized contributions. A theoretical framework for constructing separable approximations in the context of optimal control is developed by leveraging decaying sensitivity in both discrete and continuous time. The results extend prior work on decay properties of solutions to Lyapunov and Riccati equations, offering new insights into polynomial and exponential decay regimes. Connections to neural networks are also explored, demonstrating how separable structures enable scalable representations of high-dimensional value functions while preserving computational efficiency.
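To make the separable structure concrete, the following is a minimal numerical sketch (not the authors' construction) of representing a function on a high-dimensional state as a sum of localized contributions. The choice of overlapping two-coordinate neighborhoods and random quadratic subfunctions is purely illustrative, standing in for the localized terms (or subnetworks) that a separable value-function approximation would use:

```python
import numpy as np

def separable_value(x, subfunctions, neighborhoods):
    """Evaluate V(x) ~= sum_j V_j(x[I_j]), where each V_j only
    sees the local coordinate block I_j of the full state x."""
    return sum(f(x[idx]) for f, idx in zip(subfunctions, neighborhoods))

# Toy setup: state dimension d = 6, with overlapping neighborhoods
# of width 2 along a chain graph (a hypothetical sparsity pattern).
d = 6
neighborhoods = [np.array([i, i + 1]) for i in range(d - 1)]

# Each localized term V_j is a small symmetric quadratic form on
# its two local coordinates -- a stand-in for a trained subnetwork.
rng = np.random.default_rng(0)
mats = [rng.standard_normal((2, 2)) for _ in neighborhoods]
subfunctions = [lambda z, M=(M + M.T) / 2: float(z @ M @ z) for M in mats]

x = rng.standard_normal(d)
v = separable_value(x, subfunctions, neighborhoods)
```

Because each term depends on only a fixed number of coordinates, the number of parameters grows linearly in the state dimension rather than exponentially, which is the source of the scalability discussed in the abstract.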

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2502.08559
Document Type:
Working Paper