Maximum‐likelihood estimation for constrained‐ or missing‐data models

Alan E. Gelfand, Bradley P. Carlin

Research output: Contribution to journal › Article › peer-review


Abstract

In statistical models involving constrained or missing data, likelihoods containing integrals emerge. In the case of both constrained and missing data, the result is a ratio of integrals, which for multivariate data may defy exact or approximate analytic expression. Seeking maximum‐likelihood estimates in such settings, we propose Monte Carlo approximants for these integrals, and subsequently maximize the resulting approximate likelihood. Iteration of this strategy expedites the maximization, while the Gibbs sampler is useful for the required Monte Carlo generation. As a result, we handle a class of models broader than the customary EM setting without using an EM‐type algorithm. Implementation of the methodology is illustrated in two numerical examples.
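Below is a minimal, hypothetical sketch of the iterative strategy described in the abstract, applied to a toy constrained-data model: observations drawn from N(mu, 1) but retained only when positive, so the likelihood carries a normalizing integral in its denominator. The sketch approximates that integral by Monte Carlo using draws made at a reference value mu0, maximizes the resulting approximate likelihood, and then updates mu0 and repeats. In this toy case direct normal sampling replaces the Gibbs sampler, and the integral actually has a closed form (1 - Phi(-mu)) that can be used to check the Monte Carlo answer; all names and model choices here are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Toy constrained data: y_i ~ N(mu_true, 1), observed only when y_i > 0.
mu_true, n = 0.5, 200
raw = rng.normal(mu_true, 1.0, size=5 * n)
y = raw[raw > 0][:n]

def mc_loglik(mu, mu0, u):
    """Monte Carlo approximant of the constrained log-likelihood in mu.

    The normalizing integral  int_0^inf phi(t - mu) dt  is estimated by
    importance sampling with draws u ~ N(mu0, 1):
        E_mu0[ 1{u > 0} * phi(u - mu) / phi(u - mu0) ].
    """
    w = np.where(u > 0,
                 np.exp(stats.norm.logpdf(u, mu) - stats.norm.logpdf(u, mu0)),
                 0.0)
    log_norm = np.log(np.mean(w))
    return np.sum(stats.norm.logpdf(y, mu)) - len(y) * log_norm

# Iterate: draw at the reference value, maximize the approximate
# likelihood, then move the reference value to the maximizer.
mu0 = 0.0
for it in range(5):
    u = rng.normal(mu0, 1.0, size=20_000)
    res = optimize.minimize_scalar(lambda m: -mc_loglik(m, mu0, u),
                                   bounds=(-3.0, 3.0), method="bounded")
    mu0 = res.x
    print(f"iteration {it + 1}: mu_hat = {mu0:.4f}")
```

Re-drawing the Monte Carlo sample at the current estimate keeps the importance weights well behaved, which is the role the iteration plays in the strategy outlined above; in higher-dimensional constrained or missing-data problems, the fresh draws would come from a Gibbs sampler rather than direct simulation.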

Original language: English (US)
Pages (from-to): 303-311
Number of pages: 9
Journal: Canadian Journal of Statistics
Volume: 21
Issue number: 3
DOIs
State: Published - Sep 1993

Keywords

  • EM algorithm
  • Gibbs sampler
  • Monte Carlo approximant

