Abstract
In this paper, we propose a new framework for implementing the interior point method (IPM) to solve very large-scale linear programs (LPs). Traditional IPMs typically use Newton's method to approximately solve a subproblem that minimizes a log-barrier penalty function at each iteration. Due to its connection to Newton's method, IPM is often classified as a second-order method, a class associated with stability and accuracy at the expense of scalability. Indeed, computing a Newton step amounts to solving a large system of linear equations, which can be done efficiently if the input data are reasonably sized, sparse, or well structured. When these conditions fail, however, traditional IPMs face a serious computational challenge. One approach to this challenge is to apply an iterative procedure, such as the preconditioned conjugate gradient method, to solve the system of linear equations. Since the linear system changes at every iteration, it is difficult to find a good preconditioner that delivers overall efficiency. In this paper, an alternative approach is proposed. Instead of applying Newton's method, we resort to the alternating direction method of multipliers (ADMM) to approximately minimize the log-barrier penalty function at each iteration, under the framework of primal–dual path-following for a homogeneous self-dual embedded LP model. The resulting algorithm is an ADMM-Based Interior Point Method, abbreviated as ABIP in this paper. The new method inherits stability from IPM and scalability from ADMM. Because of its self-dual embedding structure, ABIP can solve any LP without requiring prior knowledge about its feasibility. We conduct extensive numerical experiments testing ABIP on large-scale LPs from NETLIB and machine learning applications. The results demonstrate that ABIP compares favourably with other LP solvers, including SDPT3, MOSEK, DSDP-CG and SCS.
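To make the central idea concrete, the following is a minimal illustrative sketch (not the authors' ABIP code) of using ADMM, rather than Newton's method, to approximately minimize a log-barrier subproblem min c'x - mu*sum(log x) subject to Ax = b. The splitting x = z puts the equality constraint in the x-update (a small linear solve) and the barrier in the z-update (a separable closed form); the problem data and parameters below are a toy example chosen for illustration.

```python
import numpy as np

def barrier_admm(A, b, c, mu=1e-3, rho=1.0, iters=2000):
    """ADMM on  min c'x - mu*sum(log x)  s.t.  Ax = b, x > 0,
    via the splitting f(x) = c'x + indicator{Ax=b}, g(z) = -mu*sum(log z)."""
    m, n = A.shape
    z = np.ones(n)          # barrier variable, kept strictly positive
    u = np.zeros(n)         # scaled dual variable
    AAt = A @ A.T           # small m-by-m system; in practice, factor once
    for _ in range(iters):
        # x-update: equality-constrained quadratic, solved in closed form
        w = z - u - c / rho
        lam = np.linalg.solve(AAt, A @ w - b)
        x = w - A.T @ lam
        # z-update: separable; positive root of rho*z^2 - rho*a*z - mu = 0
        a = x + u
        z = 0.5 * (a + np.sqrt(a * a + 4.0 * mu / rho))
        # dual update
        u += x - z
    return z

# Toy LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0  (optimum is (1, 0))
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x = barrier_admm(A, b, c)
print(x, c @ x)  # near (1, 0), objective close to 1 for small mu
```

In ABIP itself this inexact barrier minimization is embedded in a primal–dual path-following scheme for the homogeneous self-dual model, with mu driven to zero across outer iterations; the sketch above only shows a single barrier subproblem at fixed mu.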
| Original language | English (US) |
|---|---|
| Pages (from-to) | 389-424 |
| Number of pages | 36 |
| Journal | Optimization Methods and Software |
| Volume | 36 |
| Issue number | 2-3 |
| DOIs | |
| State | Published - 2021 |
Bibliographical note
Funding Information: We would like to thank two anonymous referees for their insightful and constructive suggestions that greatly improved the presentation of this paper. We would like to thank Zaiwen Wen for fruitful discussions regarding this project and much help with the code. The research of Shiqian Ma was supported in part by NSF grants DMS-1953210 and CCF-2007797, and the UC Davis CeDAR (Center for Data Science and Artificial Intelligence Research) Innovative Data Science Seed Funding Program.
Publisher Copyright:
© 2020 Informa UK Limited, trading as Taylor & Francis Group.
Keywords
- ADMM
- Linear programming
- central path following
- homogeneous self-dual embedding
- interior-point method
- iteration complexity