Constrained maximum likelihood exemplified by isotonic convex logistic regression

Research output: Contribution to journal › Article › peer-review



Maximum likelihood estimates in problems in which the likelihood is smooth and the parameter space is defined by linear or smooth nonlinear inequality constraints can be obtained using available nonlinear optimization packages, such as NPSOL. This class of models includes generalized linear models with order restrictions, convexity, or smoothness constraints on the parameters (smoothness constraints being in the form of bounds on finite differences of the regression function). Methods for such models are demonstrated by an analysis of data from four published studies of the incidence of Down’s syndrome in single-year maternal age intervals. Constrained logistic regression of the incidence of Down’s syndrome on maternal age shows that the data are fitted well by a nondecreasing convex function. P values for the likelihood ratio test of this model against alternatives are obtained by the parametric bootstrap and iterated parametric bootstrap. The iterated bootstrap is used as a diagnostic tool: It demonstrates that the bootstrap works in addition to providing a correction term. These methods provide a general approach applicable to most models specified by constraints on the parameter space.
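The constrained-MLE setup described above (log-odds parameters subject to linear inequality constraints encoding monotonicity and convexity) can be sketched as follows. This is a hypothetical illustration, not the paper's code: the paper used the NPSOL optimizer, here SciPy's `trust-constr` method is substituted, and the counts are made up rather than the published Down's syndrome data.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint
from scipy.special import expit


def fit_isotonic_convex_logistic(y, n):
    """Constrained MLE for binomial counts y out of n in ordered cells.

    theta[i] is the log-odds in cell i; linear constraints force theta
    to be nondecreasing (first differences >= 0) and convex (second
    differences >= 0). A sketch standing in for the NPSOL-based fits
    in the paper.
    """
    k = len(y)
    y = np.asarray(y, dtype=float)
    n = np.asarray(n, dtype=float)

    def nll(theta):
        # Binomial negative log-likelihood, up to an additive constant:
        # -(y*theta - n*log(1 + exp(theta)))
        return -np.sum(y * theta - n * np.logaddexp(0.0, theta))

    def grad(theta):
        return -(y - n * expit(theta))

    # Rows of D1 give theta[i+1] - theta[i]; rows of D2 give
    # theta[i+2] - 2*theta[i+1] + theta[i].
    D1 = np.diff(np.eye(k), axis=0)
    D2 = np.diff(np.eye(k), n=2, axis=0)
    cons = LinearConstraint(np.vstack([D1, D2]), 0.0, np.inf)

    # A linear (hence convex, nondecreasing) feasible starting value.
    theta0 = np.linspace(-6.0, -4.0, k)
    res = minimize(nll, theta0, jac=grad, method="trust-constr",
                   constraints=[cons])
    return res.x


# Illustrative data (made up, NOT the published incidence counts):
# y = affected births, n = total births in single-year age intervals.
n = np.array([8000, 7000, 6000, 4000, 2500, 1200, 500, 200])
y = np.array([4, 4, 5, 5, 6, 7, 6, 5])
theta_hat = fit_isotonic_convex_logistic(y, n)
```

Because the binomial log-likelihood is concave in the log-odds and the constraints are linear, this is a convex program, so the fitted `theta_hat` is a global constrained maximum; the likelihood-ratio p-values in the paper would then be obtained by refitting on parametric-bootstrap resamples.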

Original language: English (US)
Pages (from-to): 717-724
Number of pages: 8
Journal: Journal of the American Statistical Association
Issue number: 415
State: Published - Sep 1991

Bibliographical note

Funding Information:
* Charles J. Geyer is Assistant Professor, School of Statistics, University of Minnesota, Minneapolis, MN 55455. This research was supported in part by National Science Foundation grants BSR-8619760 and DMS-9007833 and was part of a Ph.D. dissertation done at the University of Washington under the supervision of Elizabeth Thompson, who helped with many aspects of this work. Some work was also done during a postdoctoral year in the Department of Statistics, University of Chicago. James Burke, R. T. Rockafellar, and Michael Saunders helped the author get started with numerical optimization and suggested using MINOS and NPSOL. Conversations with Michael Newton and Jon Wellner helped with the bootstrap and iterated bootstrap. Steven Self, Nuala Sheehan, Tim Haas, and several sets of referees, associate editors, and editors made helpful criticisms of drafts.


Keywords

  • Bootstrap diagnostic
  • Iterated bootstrap
  • Nonlinear optimization
  • Nonstandard likelihood ratio test
  • Parametric bootstrap
  • Prepivoting
  • Smoothing


