### Abstract

Let Γ_n be an n × n Haar-invariant orthogonal matrix. Let Z_n be the p × q upper-left submatrix of Γ_n, where p = p_n and q = q_n are two positive integers. Let G_n be a p × q matrix whose pq entries are independent standard normals. In this paper we consider the distance between √n Z_n and G_n in terms of the total variation distance, the Kullback-Leibler distance, the Hellinger distance, and the Euclidean distance. We prove that each of the first three distances goes to zero as long as pq/n goes to zero, and does not do so if (p, q) sits on the curve pq = σn, where σ is a constant. The Euclidean distance behaves differently: it goes to zero provided pq²/n goes to zero, and does not do so if (p, q) sits on the curve pq² = σn. A previous work by Jiang (2006) shows that the total variation distance goes to zero if both p/√n and q/√n go to zero, and that this fails if p = c√n and q = d√n with c and d constants. One of the above results confirms a conjecture that the total variation distance goes to zero as long as pq/n → 0, and that the distance does not go to zero if pq = σn for some constant σ.
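The Gaussian approximation in the abstract can be checked numerically. The sketch below (a minimal illustration using NumPy, not the paper's method; the helper name `haar_orthogonal` is ours) samples Haar-distributed orthogonal matrices via the QR decomposition of a Gaussian matrix with the standard sign correction, extracts the p × q upper-left corner Z_n, and verifies that the entries of √n Z_n look like standard normals when pq/n is small:

```python
import numpy as np

def haar_orthogonal(n, rng):
    """Sample an n x n Haar-distributed orthogonal matrix.

    QR of a Gaussian matrix alone is not Haar; rescaling the
    columns of Q so that diag(R) > 0 fixes the distribution.
    """
    A = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(A)
    Q *= np.sign(np.diag(R))  # column-wise sign correction
    return Q

rng = np.random.default_rng(0)
n, p, q = 400, 3, 3  # pq/n = 9/400 is small, so sqrt(n) Z_n should be near-Gaussian

samples = []
for _ in range(200):
    Q = haar_orthogonal(n, rng)
    samples.append(np.sqrt(n) * Q[:p, :q])  # sqrt(n) Z_n

entries = np.concatenate([s.ravel() for s in samples])
# Each entry of Q has mean 0 and variance exactly 1/n, so after
# scaling the empirical mean and variance should be near 0 and 1.
print(entries.mean(), entries.var())
```

In the opposite regime, e.g. p = q on the order of √n, the dependence among entries (the rows of Γ_n are unit vectors) is no longer negligible and the total variation distance stays bounded away from zero, as the abstract states.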

Field | Value
---|---
Original language | English (US)
Pages (from-to) | 1509-1553
Number of pages | 45
Journal | Transactions of the American Mathematical Society
Volume | 372
Issue number | 3
DOIs | https://doi.org/10.1090/tran/7470
State | Published - Aug 1 2019

### Keywords

- Convergence of probability measure
- Haar measure
- Orthogonal group
- Random matrix

## Cite this

Distances between random orthogonal matrices and independent normals. *Transactions of the American Mathematical Society*, *372*(3), 1509-1553. https://doi.org/10.1090/tran/7470