Search Results (11 results)
2. Online regularized learning algorithm for functional data
3. Bypassing the quadrature exactness assumption of hyperinterpolation on the sphere
4. Enhancing the applicability of Chebyshev-like method
5. Linear Monte Carlo quadrature with optimal confidence intervals
6. On a class of linear regression methods
7. Tamed-adaptive Euler-Maruyama approximation for SDEs with superlinearly growing and piecewise continuous drift, superlinearly growing and locally Hölder continuous diffusion
8. Nonlinear Tikhonov regularization in Hilbert scales for inverse learning
9. Randomized complexity of parametric integration and the role of adaption I. Finite dimensional case
10. Optimal recovery and generalized Carlson inequality for weights with symmetry properties
11. Convergence of the Gauss-Newton method for convex composite optimization problems under majorant condition on Riemannian manifolds
Discovery Service for Jio Institute Digital Library