285 results for "Drusvyatskiy, Dmitriy"
Search Results
52. Noisy Euclidean distance realization: robust facial reduction and the Pareto frontier
53. Projection methods in quantum information science
54. Stochastic Subgradient Method Converges on Tame Functions
55. Orbits of geometric descent
56. Orthogonal Invariance and Identifiability
57. Clarke subgradients for directionally Lipschitzian stratifiable functions
58. Optimality, identifiability, and sensitivity
59. Tilt stability, uniform quadratic growth, and strong metric regularity of the subdifferential
60. Complexity of a Single Face in an Arrangement of s-Intersecting Curves
61. The dimension of semialgebraic subdifferential graphs
62. Generic nondegeneracy in convex optimization
63. Semi-algebraic functions have small subdifferentials
64. Flat minima generalize for low-rank matrix recovery.
65. Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
66. Stochastic algorithms with geometric step decay converge linearly on sharp functions
67. The slope robustly determines convex functions
68. Stochastic Optimization with Decision-Dependent Distributions
69. The Euclidean distance degree of orthogonally invariant matrix varieties
70. A note on alternating projections for ill-posed semidefinite feasibility problems
71. Clarke Subgradients for Directionally Lipschitzian Stratifiable Functions
72. Improved Rates for Derivative Free Gradient Play in Strongly Monotone Games
73. Escaping Strict Saddle Points of the Moreau Envelope in Nonsmooth Optimization
74. Projection methods for quantum channel construction
75. Decision-Dependent Risk Minimization in Geometrically Decaying Dynamic Environments
76. Generic nondegeneracy in convex optimization
77. Graphical Convergence of Subgradients in Nonconvex Optimization and Learning
78. Stochastic Optimization under Distributional Drift.
79. Conservative and Semismooth Derivatives are Equivalent for Semialgebraic Maps
80. Proximal Methods Avoid Active Strict Saddles of Weakly Convex Functions
81. Composite optimization for robust rank one bilinear sensing
82. The nonsmooth landscape of phase retrieval
83. Pathological Subgradient Dynamics
84. Catalyst for Gradient-based Nonconvex Optimization
85. Stochastic Subgradient Method Converges on Tame Functions
86. Stochastic Model-Based Minimization of Weakly Convex Functions
87. Composite optimization for robust rank one bilinear sensing.
88. From Low Probability to High Confidence in Stochastic Convex Optimization.
89. Subgradient Methods for Sharp Weakly Convex Functions
90. Efficient Quadratic Penalization Through the Partial Minimization Technique
91. An Optimal First Order Method Based on Optimal Quadratic Averaging
92. Sweeping by a tame process
93. The Many Faces of Degeneracy in Conic Optimization
94. A note on alternating projections for ill-posed semidefinite feasibility problems
95. Coordinate Shadows of Semidefinite and Euclidean Distance Matrices
96. Counting Real Critical Points of the Distance to Orthogonally Invariant Matrix Sets
97. Generic nondegeneracy in convex optimization
98. Multiplayer Performative Prediction: Learning in Decision-Dependent Games.
99. Applications of Point Process Convergence
100. Infinite-Variance Central Limit Theory