Deep ReLU networks and high-order finite element methods II: Chebyšev emulation.

Authors :
Opschoor, Joost A.A.
Schwab, Christoph
Source :
Computers & Mathematics with Applications. Sep 2024, Vol. 169, p. 142-162. 21p.
Publication Year :
2024

Abstract

We show expression rates and stability in Sobolev norms of deep feedforward ReLU neural networks (NNs) in terms of the number of parameters defining the NN for continuous, piecewise polynomial functions, on arbitrary, finite partitions T of a bounded interval (a , b). Novel constructions of ReLU NN surrogates encoding function approximations in terms of Chebyšev polynomial expansion coefficients are developed which require fewer neurons than previous constructions. Chebyšev coefficients can be computed easily from the values of the function in the Clenshaw–Curtis points using the inverse fast Fourier transform. Bounds on expression rates and stability are obtained that are superior to those of constructions based on ReLU NN emulations of monomials as considered in [24,22]. All emulation bounds are explicit in terms of the (arbitrary) partition of the interval, the target emulation accuracy and the polynomial degree in each element of the partition. ReLU NN emulation error estimates are provided for various classes of functions and norms, commonly encountered in numerical analysis. In particular, we show exponential ReLU emulation rate bounds for analytic functions with point singularities and develop an interface between Chebfun approximations and constructive ReLU NN emulations. [ABSTRACT FROM AUTHOR]
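The abstract notes that Chebyšev coefficients can be computed from function values at the Clenshaw–Curtis points via the inverse fast Fourier transform. As a minimal illustrative sketch (not the authors' code; the function name and interface are hypothetical), the standard FFT-based computation of the degree-n Chebyšev interpolant's coefficients looks like this:

```python
import numpy as np

def chebyshev_coefficients(f, n):
    """Chebyshev coefficients of the degree-n interpolant of f on [-1, 1],
    sampled at the Clenshaw-Curtis points x_k = cos(pi*k/n), computed via FFT.
    Illustrative sketch only, following the classical DCT-via-FFT approach."""
    k = np.arange(n + 1)
    x = np.cos(np.pi * k / n)          # Clenshaw-Curtis (Chebyshev extreme) points
    v = f(x)
    # Mirror the samples into an even, 2n-periodic sequence; the real part of
    # its FFT is the discrete cosine transform of the sample vector.
    V = np.fft.fft(np.concatenate([v, v[n - 1:0:-1]]))
    c = np.real(V[:n + 1]) / n
    c[0] /= 2                          # endpoint weights of the cosine transform
    c[n] /= 2
    return c
```

For example, since x^2 = (T_0(x) + T_2(x))/2, calling `chebyshev_coefficients(lambda x: x**2, 4)` returns coefficients close to [0.5, 0, 0.5, 0, 0]. On a general interval (a, b) the samples would first be mapped affinely to [-1, 1].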

Details

Language :
English
ISSN :
0898-1221
Volume :
169
Database :
Academic Search Index
Journal :
Computers & Mathematics with Applications
Publication Type :
Academic Journal
Accession number :
178908857
Full Text :
https://doi.org/10.1016/j.camwa.2024.06.008