
Showing 297 results

Search Results

1. Gradient regularization of Newton method with Bregman distances.

2. Convergence rates of the Heavy-Ball method under the Łojasiewicz property.

3. Characterizing quasiconvexity of the pointwise infimum of a family of arbitrary translations of quasiconvex functions, with applications to sums and quasiconvex optimization.

4. Robust convex optimization: A new perspective that unifies and extends.

5. Fast Augmented Lagrangian Method in the convex regime with convergence guarantees for the iterates.

6. Difference of convex algorithms for bilevel programs with applications in hyperparameter selection.

7. Worst-case complexity of cyclic coordinate descent: $O(n^2)$ gap with randomized version.

8. Optimal matroid bases with intersection constraints: valuated matroids, M-convex functions, and their applications.

9. Universal gradient methods for convex optimization problems.

10. Ideal formulations for constrained convex optimization problems with indicator variables.

11. On t-branch split cuts for mixed-integer programs.

12. Fast alternating linearization methods for minimizing the sum of two convex functions.

13. On conic QPCCs, conic QCQPs and completely positive programs.

14. Convergence analysis of iterative methods for nonsmooth convex optimization over fixed point sets of quasi-nonexpansive mappings.

15. Subdifferential of the supremum function: moving back and forth between continuous and non-continuous settings.

16. New metric properties for prox-regular sets.

17. Simple bilevel programming and extensions.

18. Why random reshuffling beats stochastic gradient descent.

19. An improved algorithm for the $L_2$-$L_p$ minimization problem.

20. Nonsmooth optimization using Taylor-like models: error bounds, convergence, and termination criteria.

21. Optimality conditions in convex multiobjective SIP.

22. Convex hull of two quadratic or a conic quadratic and a quadratic inequality.

23. On the worst case performance of the steepest descent algorithm for quadratic functions.

24. A family of second-order methods for convex $\ell_1$-regularized optimization.

25. Primal and dual active-set methods for convex quadratic programming.

26. On sublinear inequalities for mixed integer conic programs.

27. Subdifferentials of value functions and optimality conditions for DC and bilevel infinite and semi-infinite programs.

28. Communication-efficient algorithms for decentralized and stochastic optimization.

29. A general double-proximal gradient algorithm for d.c. programming.

30. Characterizations of mixed binary convex quadratic representable sets.

31. Simple examples for the failure of Newton's method with line search for strictly convex minimization.

32. A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions.

33. Robust multicategory support matrix machines.

34. Enhanced proximal DC algorithms with extrapolation for a class of structured nonsmooth DC minimization.

35. On variance reduction for stochastic smooth convex optimization with multiplicative noise.

36. On error bound moduli for locally Lipschitz and regular functions.

37. A first order method for finding minimal norm-like solutions of convex optimization problems.

38. Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function.

39. Solving the degree-concentrated fault-tolerant spanning subgraph problem by DC programming.

40. A study of the difference-of-convex approach for solving linear programs with complementarity constraints.

41. Representative functions of maximally monotone operators and bifunctions.

42. Probably certifiably correct k-means clustering.

43. A unified approach to error bounds for structured convex optimization problems.

44. From error bounds to the complexity of first-order descent methods for convex functions.

45. Necessary global optimality conditions for nonlinear programming problems with polynomial constraints.

46. Regularity estimates for convex multifunctions.

47. Convex risk measures for portfolio optimization and concepts of flexibility.

48. Newton methods for nonsmooth convex minimization: connections among $\mathcal{U}$-Lagrangian, Riemannian Newton and SQP methods.

49. Typical convex program is very well posed.

50. Notes on L-/M-convex functions and the separation theorems.