A Comprehensive Study on Ziv-Zakai Lower Bounds on the MMSE

Minoh Jeong, Alex Dytso, Martina Cardone

Research output: Contribution to journal › Article › peer-review

Abstract

This paper explores Bayesian lower bounds on the minimum mean squared error (MMSE) that belong to the well-known Ziv-Zakai family. The Ziv-Zakai technique relies on connecting the bound to an M-ary hypothesis testing problem. There are three versions of the Ziv-Zakai bound (ZZB): the first version relies on the so-called valley-filling function, the second one is a relaxation of the first that omits the valley-filling function, and the third one, namely the single-point ZZB (SZZB), replaces the integration present in the first two bounds with a single-point maximization. The first part of this paper focuses on providing the most general version of the bounds. It is shown that these bounds hold without any assumption on the distribution of the estimand, which makes them applicable to discrete and mixed distributions. Then, the SZZB is extended to an M-ary setting, and a version of it that holds in the multivariate setting is provided. In the second part, general properties of these bounds are established. First, unlike the Bayesian Cramér-Rao bound, all versions of the ZZB are shown to tensorize. Second, a characterization of the high-noise asymptotic is provided, which is used to argue about the tightness of the bounds. Third, a complete low-noise asymptotic characterization is provided under the assumptions of mixed-input distributions and Gaussian additive noise channels. In the low-noise regime, it is shown that the ZZB is generally tight, but there are examples for which the SZZB is not tight. In the third part, the tightness of the bounds is evaluated. First, it is shown that in the low-noise regime the ZZB without the valley-filling function, and, therefore, also the ZZB with the valley-filling function, are tight for mixed-input distributions and Gaussian additive noise channels. Second, for discrete inputs it is shown that the ZZB with the valley-filling function is always sub-optimal, and equal to zero without the valley-filling function. Third, unlike for the ZZB, an example is provided for which the SZZB is tight to the MMSE for discrete inputs. Fourth, necessary and sufficient conditions for the tightness of the bounds are provided. Finally, some examples are provided in which the bounds in the Ziv-Zakai family outperform other well-known Bayesian lower bounds, namely the Cramér-Rao bound and the maximum entropy bound.
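For reference, a sketch of the classical scalar ZZB with the valley-filling function, in the standard form found in the literature (e.g., Bell, Steinberg, Ephraim, and Van Trees, IEEE Trans. Inf. Theory, 1997); the paper's general statements relax the assumptions on the distribution of the estimand X:

% Classical scalar Ziv-Zakai bound (valley-filling version); illustrative only.
% f_X is the prior density of X, Y is the observation,
% P_e(x, x+h) is the minimum error probability of the binary hypothesis test
% between X = x and X = x + h given Y, with priors proportional to f_X(x) and f_X(x+h),
% and V{g}(h) = sup_{eta >= 0} g(h + eta) is the valley-filling function.
\begin{equation*}
\mathrm{mmse}(X \mid Y) \;\geq\; \frac{1}{2} \int_{0}^{\infty}
  \mathcal{V}\!\left\{ \int_{-\infty}^{\infty}
    \bigl( f_X(x) + f_X(x+h) \bigr)\, P_e(x, x+h)\, \mathrm{d}x \right\} h \,\mathrm{d}h .
\end{equation*}

The second version of the bound omits the valley-filling operator, and the SZZB replaces the outer integration over h with a maximization at a single point, as described in the abstract above.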

Original language: English (US)
Journal: IEEE Transactions on Information Theory
State: Accepted/In press - 2025

Bibliographical note

Publisher Copyright:
© 2025 IEEE. All rights reserved.

Keywords

  • Bayesian lower bound
  • MMSE lower bound
  • Ziv-Zakai bound
