
SPECTRAL CONVERGENCE OF DIFFUSION MAPS: IMPROVED ERROR BOUNDS AND AN ALTERNATIVE NORMALIZATION.

Authors :
WORMELL, CAROLINE L.
REICH, SEBASTIAN
Source :
SIAM Journal on Numerical Analysis. 2021, Vol. 59 Issue 3, p1687-1734. 48p.
Publication Year :
2021

Abstract

Diffusion maps is a manifold learning algorithm widely used for dimensionality reduction. Using a sample from a distribution, it approximates the eigenvalues and eigenfunctions of associated Laplace-Beltrami operators. Theoretical bounds on the approximation error are, however, generally much weaker than the rates that are seen in practice. This paper uses new approaches to improve the error bounds in the model case where the distribution is supported on a hypertorus. For the data sampling (variance) component of the error we make spatially localized compact embedding estimates on certain Hardy spaces; we study the deterministic (bias) component as a perturbation of the Laplace-Beltrami operator's associated PDE and apply relevant spectral stability results. Using these approaches, we match long-standing pointwise error bounds for both the spectral data and the norm convergence of the operator discretization. We also introduce an alternative normalization for diffusion maps based on Sinkhorn weights. This normalization approximates a Langevin diffusion on the sample and yields a symmetric operator approximation. We prove that it has better convergence than the standard normalization on flat domains, and we present a highly efficient rigorous algorithm to compute the Sinkhorn weights. [ABSTRACT FROM AUTHOR]
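For orientation, the following Python sketch illustrates the two constructions the abstract refers to: the standard (alpha = 1) diffusion maps normalization of Coifman and Lafon, and a symmetric normalization built from Sinkhorn weights. All function names, the Gaussian kernel choice, the bandwidth parameter eps, and the fixed-point Sinkhorn iteration are illustrative assumptions; in particular, the paper presents a considerably more efficient rigorous algorithm for computing the Sinkhorn weights, which is not reproduced here.

    import numpy as np

    def gaussian_kernel(X, eps):
        # Pairwise Gaussian affinities between sample points (rows of X).
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / eps)

    def diffusion_maps(X, eps, n_eigs=5):
        # Standard (alpha = 1) normalization: divide out the sampling density,
        # then row-normalize to obtain a Markov matrix whose leading eigenpairs
        # approximate Laplace-Beltrami eigenpairs.
        K = gaussian_kernel(X, eps)
        q = K.sum(axis=1)
        K1 = K / np.outer(q, q)
        P = K1 / K1.sum(axis=1)[:, None]
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)[:n_eigs]
        return vals.real[order], vecs.real[:, order]

    def sinkhorn_weights(K, n_iter=1000, tol=1e-12):
        # Damped fixed-point iteration for weights w with
        # w_i * sum_j K_ij * w_j = 1, so that diag(w) K diag(w) is symmetric
        # and doubly stochastic. A generic sketch only, not the paper's
        # accelerated rigorous algorithm.
        w = 1.0 / np.sqrt(K.sum(axis=1))
        for _ in range(n_iter):
            w_new = np.sqrt(w / (K @ w))
            if np.max(np.abs(w_new - w)) < tol:
                return w_new
            w = w_new
        return w

    def sinkhorn_diffusion_maps(X, eps, n_eigs=5):
        # Symmetric operator approximation via the Sinkhorn normalization.
        K = gaussian_kernel(X, eps)
        w = sinkhorn_weights(K)
        P_sym = (w[:, None] * K) * w[None, :]
        vals, vecs = np.linalg.eigh(P_sym)
        return vals[::-1][:n_eigs], vecs[:, ::-1][:, :n_eigs]

As a simple usage check, for points sampled from a circle embedded in the plane both constructions produce leading nontrivial eigenvectors resembling sine and cosine modes; the Sinkhorn variant additionally yields a symmetric matrix, so its eigenproblem can be solved with a symmetric solver.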

Details

Language :
English
ISSN :
0036-1429
Volume :
59
Issue :
3
Database :
Academic Search Index
Journal :
SIAM Journal on Numerical Analysis
Publication Type :
Academic Journal
Accession number :
151385861
Full Text :
https://doi.org/10.1137/20M1344093