TY - GEN
T1 - TNT
T2 - 32nd IEEE International Parallel and Distributed Processing Symposium Workshops, IPDPSW 2018
AU - Myre, Joseph M.
AU - Frahm, Erich
AU - Lilja, David J.
AU - Saar, Martin O.
PY - 2018/8/3
Y1 - 2018/8/3
AB  - Since its inception by Gauss, the least-squares problem has frequently arisen in science, mathematics, and engineering. Iterative methods, such as Conjugate Gradient Normal Residual (CGNR), have been popular for solving sparse least-squares problems, but have historically been regarded as undesirable for dense applications due to poor convergence. We contend that this traditional 'common knowledge' should be reexamined. Preconditioned CGNR, and perhaps other iterative methods, should be considered alongside standard methods when addressing large dense least-squares problems. In this paper we present TNT, a dynamite method for solving large dense least-squares problems. TNT implements a Cholesky preconditioner for the fast CGNR iterative method. The Cholesky factorization provides a preconditioner that, in the absence of round-off error, would yield convergence in a single iteration. Through this preconditioner and good parallel scaling, TNT provides improved performance over traditional least-squares solvers, allowing for accelerated investigations of scientific and engineering problems. We compare a parallel implementation of TNT to parallel implementations of other conventional methods, including the normal equations and the QR method. For the small systems tested (15000 × 15000 or smaller), TNT produces smaller solution errors and executes up to 16× faster than the other tested methods. We then apply TNT to a representative rock magnetism inversion problem, where it yields the best solution accuracy and execution time of all tested methods.
KW - Least Squares
KW - Preconditioned Conjugate Gradient
KW - Rock Magnetism
UR - http://www.scopus.com/inward/record.url?scp=85052195736&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85052195736&partnerID=8YFLogxK
U2 - 10.1109/IPDPSW.2018.00153
DO - 10.1109/IPDPSW.2018.00153
M3 - Conference contribution
AN - SCOPUS:85052195736
SN - 9781538655559
T3 - Proceedings - 2018 IEEE 32nd International Parallel and Distributed Processing Symposium Workshops, IPDPSW 2018
SP - 987
EP - 995
BT - Proceedings - 2018 IEEE 32nd International Parallel and Distributed Processing Symposium Workshops, IPDPSW 2018
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 May 2018 through 25 May 2018
ER -