Distances between random orthogonal matrices and independent normals

Tiefeng Jiang, Yutao Ma

Research output: Contribution to journal › Article › peer-review


Abstract

Let Γn be an n × n Haar-invariant orthogonal matrix. Let Zn be the p × q upper-left submatrix of Γn, where p = pn and q = qn are two positive integers. Let Gn be a p × q matrix whose pq entries are independent standard normals. In this paper we consider the distance between √n Zn and Gn in terms of the total variation distance, the Kullback-Leibler distance, the Hellinger distance, and the Euclidean distance. We prove that each of the first three distances goes to zero as long as pq/n goes to zero, and does not if (p, q) sits on the curve pq = σn, where σ is a constant. However, the Euclidean distance behaves differently: it goes to zero provided pq²/n goes to zero, and does not if (p, q) sits on the curve pq² = σn. A previous work by Jiang (2006) shows that the total variation distance goes to zero if both p/√n and q/√n go to zero, and that this fails when p = c√n and q = d√n with c and d constants. One of the results above confirms the conjecture that the total variation distance goes to zero as long as pq/n → 0 and does not go to zero if pq = σn for some constant σ.
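The central approximation in the abstract can be checked informally by simulation. The sketch below is not part of the paper: it uses the standard QR-with-sign-correction construction to sample a Haar-distributed orthogonal matrix, extracts the scaled upper-left block √n Zn in a regime where pq/n is small, and compares the pooled entries with standard normals. The function name haar_orthogonal and all parameter values are illustrative choices, not anything prescribed by the authors.

```python
# Minimal simulation sketch (not from the paper): sample Haar orthogonal
# matrices, take the scaled p x q upper-left block sqrt(n) * Z_n, and compare
# its entries with independent standard normals.
import numpy as np
from scipy import stats

def haar_orthogonal(n, rng):
    """Sample from Haar measure on O(n) via QR of a Gaussian matrix,
    with the sign correction that makes Q exactly Haar-distributed."""
    A = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))  # flip columns so that diag(R) > 0

rng = np.random.default_rng(0)
n, p, q, reps = 500, 4, 4, 100      # pq/n = 0.032, i.e. the pq = o(n) regime

entries = []
for _ in range(reps):
    Q = haar_orthogonal(n, rng)
    entries.append(np.sqrt(n) * Q[:p, :q].ravel())  # entries of sqrt(n) * Z_n
x = np.concatenate(entries)

# When pq/n is small the entries should look close to N(0, 1); a
# Kolmogorov-Smirnov test against the standard normal is an informal check.
print("mean %.3f  var %.3f" % (x.mean(), x.var()))
print(stats.kstest(x, "norm"))
```

Note that pooling entries only probes the marginal law, which is close to normal even for large blocks; the total variation, Kullback-Leibler, and Hellinger results in the paper concern the joint distribution of the whole block, which is where the approximation breaks down on the curve pq = σn.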

Original language: English (US)
Pages (from-to): 1509-1553
Number of pages: 45
Journal: Transactions of the American Mathematical Society
Volume: 372
Issue number: 3
DOIs
State: Published - Aug 1 2019

Bibliographical note

Publisher Copyright:
© 2019 American Mathematical Society.

Keywords

  • Convergence of probability measure
  • Haar measure
  • Orthogonal group
  • Random matrix
