Score-Based Generative Models Break the Curse of Dimensionality in Learning a Family of Sub-Gaussian Probability Distributions

Frank Cole, Yulong Lu

Research output: Contribution to conference › Paper › peer-review

1 Scopus citation

Abstract

While score-based generative models (SGMs) have achieved remarkable success in numerous image generation tasks, their mathematical foundations are still limited. In this paper, we analyze the approximation and generalization of SGMs in learning a family of sub-Gaussian probability distributions. We introduce a notion of complexity for probability distributions in terms of their relative density with respect to the standard Gaussian measure. We prove that if the log-relative density can be locally approximated by a neural network whose parameters can be suitably bounded, then the distribution generated by empirical score matching approximates the target distribution in total variation with a dimension-independent rate. We illustrate our theory through examples, which include certain mixtures of Gaussians. An essential ingredient of our proof is to derive a dimension-free deep neural network approximation rate for the true score function associated with the forward process, which is interesting in its own right.
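To make the setting concrete: empirical score matching, as referenced in the abstract, trains a network against the score of the forward noising process. Below is a minimal, purely illustrative PyTorch sketch (not the paper's construction) of the standard denoising score-matching objective on a two-component Gaussian mixture, one of the example families mentioned above. It uses the Ornstein-Uhlenbeck forward perturbation x_t = e^{-t} x_0 + sqrt(1 - e^{-2t}) z, whose conditional score given x_0 is -z/sigma_t. All names (ScoreNet, sample_mixture) and hyperparameters here are assumptions chosen for illustration.

```python
# Illustrative sketch: denoising score matching for the OU forward process
# on a 1D two-component Gaussian mixture. Names and hyperparameters are
# hypothetical, not taken from the paper.
import torch
import torch.nn as nn

torch.manual_seed(0)

def sample_mixture(n):
    """Draw n samples from 0.5*N(-2,1) + 0.5*N(2,1)."""
    comp = torch.randint(0, 2, (n, 1)).float()  # mixture component in {0, 1}
    means = comp * 4.0 - 2.0                    # maps {0, 1} -> {-2, +2}
    return means + torch.randn(n, 1)

class ScoreNet(nn.Module):
    """Small MLP approximating the score s(x, t) ~ grad_x log p_t(x)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=1))

model = ScoreNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x0 = sample_mixture(256)
    t = torch.rand(256, 1) * 3.0 + 1e-3            # diffusion times in (0, 3]
    mean_coef = torch.exp(-t)                      # OU mean decay e^{-t}
    sigma = torch.sqrt(1.0 - torch.exp(-2.0 * t))  # OU noise scale
    z = torch.randn_like(x0)
    xt = mean_coef * x0 + sigma * z                # forward perturbation
    # Denoising score matching: the score of x_t given x_0 is -z / sigma.
    loss = ((model(xt, t) + z / sigma) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The trained network could then be plugged into a reverse-time sampler; the paper's analysis concerns how well such an empirically score-matched model approximates the target distribution in total variation.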

Original language: English (US)
State: Published - 2024
Externally published: Yes
Event: 12th International Conference on Learning Representations, ICLR 2024 - Hybrid, Vienna, Austria
Duration: May 7, 2024 – May 11, 2024

Conference

Conference: 12th International Conference on Learning Representations, ICLR 2024
Country/Territory: Austria
City: Hybrid, Vienna
Period: 5/7/24 – 5/11/24

Bibliographical note

Publisher Copyright:
© 2024 12th International Conference on Learning Representations, ICLR 2024. All rights reserved.
