TY - JOUR
T1 - A Comprehensive Study on Ziv-Zakai Lower Bounds on the MMSE
AU - Jeong, Minoh
AU - Dytso, Alex
AU - Cardone, Martina
N1 - Publisher Copyright:
© 2025 IEEE. All rights reserved.
PY - 2025
Y1 - 2025
N2 - This paper explores Bayesian lower bounds on the minimum mean squared error (MMSE) that belong to the well-known Ziv-Zakai family. The Ziv-Zakai technique relies on connecting the bound to an M-ary hypothesis testing problem. There are three versions of the Ziv-Zakai bound (ZZB): the first version relies on the so-called valley-filling function, the second one is a relaxation of the first bound which omits the valley-filling function, and the third one, namely the single-point ZZB (SZZB), replaces the integration present in the first two bounds with a single-point maximization. The first part of this paper focuses on providing the most general version of the bounds. It is shown that these bounds hold without any assumption on the distribution of the estimand. This makes the bounds applicable to discrete and mixed distributions. Then, the SZZB is extended to an M-ary setting and a version of it that holds for the multivariate setting is provided. In the second part, general properties of these bounds are provided. First, unlike the Bayesian Cramér-Rao bound, it is shown that all the versions of the ZZB tensorize. Second, a characterization of the high-noise asymptotic is provided, which is used to argue about the tightness of the bounds. Third, a complete low-noise asymptotic is provided under the assumptions of mixed-input distributions and Gaussian additive noise channels. In the low-noise regime, it is shown that the ZZB is generally tight, but there are examples for which the SZZB is not tight. In the third part, the tightness of the bounds is evaluated. First, it is shown that in the low-noise regime the ZZB without the valley-filling function, and, therefore, also the ZZB with the valley-filling function, are tight for mixed-input distributions and Gaussian additive noise channels. Second, for discrete inputs it is shown that the ZZB with the valley-filling function is always sub-optimal, and equal to zero without the valley-filling function. Third, unlike for the ZZB, an example is shown for which the SZZB is tight to the MMSE for discrete inputs. Fourth, necessary and sufficient conditions for the tightness of the bounds are provided. Finally, some examples are provided in which the bounds in the Ziv-Zakai family outperform other well-known Bayesian lower bounds, namely the Cramér-Rao bound and the maximum entropy bound.
AB - This paper explores Bayesian lower bounds on the minimum mean squared error (MMSE) that belong to the well-known Ziv-Zakai family. The Ziv-Zakai technique relies on connecting the bound to an M-ary hypothesis testing problem. There are three versions of the Ziv-Zakai bound (ZZB): the first version relies on the so-called valley-filling function, the second one is a relaxation of the first bound which omits the valley-filling function, and the third one, namely the single-point ZZB (SZZB), replaces the integration present in the first two bounds with a single-point maximization. The first part of this paper focuses on providing the most general version of the bounds. It is shown that these bounds hold without any assumption on the distribution of the estimand. This makes the bounds applicable to discrete and mixed distributions. Then, the SZZB is extended to an M-ary setting and a version of it that holds for the multivariate setting is provided. In the second part, general properties of these bounds are provided. First, unlike the Bayesian Cramér-Rao bound, it is shown that all the versions of the ZZB tensorize. Second, a characterization of the high-noise asymptotic is provided, which is used to argue about the tightness of the bounds. Third, a complete low-noise asymptotic is provided under the assumptions of mixed-input distributions and Gaussian additive noise channels. In the low-noise regime, it is shown that the ZZB is generally tight, but there are examples for which the SZZB is not tight. In the third part, the tightness of the bounds is evaluated. First, it is shown that in the low-noise regime the ZZB without the valley-filling function, and, therefore, also the ZZB with the valley-filling function, are tight for mixed-input distributions and Gaussian additive noise channels. Second, for discrete inputs it is shown that the ZZB with the valley-filling function is always sub-optimal, and equal to zero without the valley-filling function. Third, unlike for the ZZB, an example is shown for which the SZZB is tight to the MMSE for discrete inputs. Fourth, necessary and sufficient conditions for the tightness of the bounds are provided. Finally, some examples are provided in which the bounds in the Ziv-Zakai family outperform other well-known Bayesian lower bounds, namely the Cramér-Rao bound and the maximum entropy bound.
KW - Bayesian lower bound
KW - MMSE lower bound
KW - Ziv-Zakai bound
UR - http://www.scopus.com/inward/record.url?scp=85217897793&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85217897793&partnerID=8YFLogxK
U2 - 10.1109/TIT.2025.3541987
DO - 10.1109/TIT.2025.3541987
M3 - Article
AN - SCOPUS:85217897793
SN - 0018-9448
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
ER -