Inspired by the recent paper (L. Ying, Journal of Scientific Computing, 84, 1–14 (2020)), we explore the relationship between mirror descent and the variable metric method. When the metric in mirror descent is induced by a convex function whose Hessian is close to the Hessian of the objective function, the method enjoys both the robustness of mirror descent and the superlinear convergence of Newton-type methods. When applied to a linearly constrained minimization problem, we prove global and local convergence in both the continuous and discrete settings. As applications, we compute Wasserstein gradient flows and the Cahn–Hilliard equation with degenerate mobility. When these problems are formulated as a minimizing movement scheme with respect to a variable metric, our mirror descent algorithm converges quickly for the underlying optimization problem while preserving the total mass and the bounds of the solution.
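To illustrate the mass- and bound-preservation property mentioned above, the following is a minimal sketch of classical mirror descent with the entropy mirror map on the probability simplex; it is an assumption-laden toy, not the paper's variable-metric algorithm. The function name, step size, and test objective are all hypothetical choices for illustration.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step=0.5, iters=200):
    """Entropy mirror descent for minimizing a convex f on the simplex.

    The mirror map phi(x) = sum_i x_i log x_i yields multiplicative
    updates, so iterates stay positive and (after normalization) keep
    unit total mass -- the same qualitative property noted in the text.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Dual-space step nabla phi(x) - step * grad f(x), mapped back
        # by exp; normalization is the Bregman projection onto the simplex.
        x = x * np.exp(-step * grad(x))
        x /= x.sum()
    return x

# Toy example: linear objective f(x) = <c, x>; the minimizer over the
# simplex concentrates on the coordinate with the smallest cost.
c = np.array([1.0, 0.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

Note that positivity and mass conservation hold at every iterate by construction, without any explicit constraint handling.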
Funding Information:
L.W. is partially supported by NSF grant DMS-1846854. M.Y. is partially supported by NSF grant DMS-2012439.
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
- Degenerate mobility
- Mirror descent
- Variable metric
- Wasserstein gradient flow