Abstract
It is well known that central order statistics exhibit central limit behavior and converge to a Gaussian distribution as the sample size grows. This paper strengthens this classical result by establishing an entropic version of the central limit theorem, which guarantees a stronger mode of convergence in relative entropy. The upgrade in convergence comes at the expense of additional regularity conditions, which can be considered mild. To prove this result, ancillary results on order statistics are derived, which may be of independent interest; for instance, a rather general bound on the moments of order statistics, and an upper bound on the mean squared error of estimating the p-th quantile, p ∈ (0, 1), of an unknown cumulative distribution function. Finally, a discussion is provided on the necessity of the derived conditions for convergence, and on the rate of convergence and monotonicity of the relative entropy.
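As a quick illustration of the classical result the paper strengthens, the following Python sketch (not taken from the paper; all names and parameter choices here are illustrative) simulates the sample median of n i.i.d. standard normal draws and checks that its standardized fluctuations match the variance predicted by the central limit theorem for order statistics.

```python
import numpy as np

# Illustrative sketch (not from the paper): the classical CLT for central
# order statistics. For X_1,...,X_n i.i.d. with CDF F, density f, and p-th
# quantile q_p = F^{-1}(p), the central order statistic of rank ~ np satisfies
#   sqrt(n) * (X_(np) - q_p)  -->  N(0, p(1-p) / f(q_p)^2).
# Here p = 1/2 (the median) with standard normal samples, so q_p = 0 and the
# limiting variance is (1/4) / phi(0)^2 = pi/2.

rng = np.random.default_rng(0)
n, trials = 1001, 20000  # odd n so the median is a single order statistic

samples = rng.standard_normal((trials, n))
medians = np.median(samples, axis=1)
standardized = np.sqrt(n) * medians  # q_p = 0 for the standard normal

limit_var = np.pi / 2  # p(1-p) / f(q_p)^2 = 0.25 / (1/sqrt(2*pi))**2
print(f"empirical variance of sqrt(n)*median: {standardized.var():.4f}")
print(f"limiting variance (pi/2):            {limit_var:.4f}")
```

The paper's contribution is a stronger statement than this distributional convergence: under mild regularity conditions, the relative entropy between the standardized order statistic and its Gaussian limit tends to zero.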
| Original language | English (US) |
|---|---|
| Pages (from-to) | 2193-2205 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 69 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 1 2023 |
| Externally published | Yes |
Bibliographical note
Funding Information: The authors would like to thank the Associate Editor and the Reviewers for their suggestions and for a speedy review process.
Publisher Copyright: IEEE
Keywords
- Central limit theorem
- median
- order statistics
- quantiles
- relative entropy