
Model Selection Using the Minimum Hellinger Distance Criterion (MHDC) with an Introduction to Minimum Hellinger Distance Estimation (MHDE)

Hutmacher, Matthew M., Anand Vidyashankar, and Debu Mukherjee

Pfizer

Background:  Model selection algorithms are prevalent in exploratory exposure-response model development.  Of these, likelihood-based criteria, such as the chi-square test, the Akaike Information Criterion, and the Schwarz Bayesian Criterion, are the most popular, partly because they are easy to apply using the output from current model-fitting software.  It is often overlooked that these tests are generally valid only for comparing hierarchical (nested) models.  Moreover, it is unclear how these procedures will perform for smaller sample sizes when the likelihood is mis-specified.
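As a point of reference (not from the abstract), these criteria are typically computed directly from the log-likelihood reported by the fitting software.  The minimal sketch below uses hypothetical inputs logL (maximized log-likelihood), k (number of estimated parameters), and n (number of observations) to show the usual formulas.

```python
# Minimal sketch (not from the abstract) of how likelihood-based criteria are
# typically computed from model-fitting output; logL, k, and n are assumed to
# be available from each candidate fit.
import math

def aic(logL, k):
    """Akaike Information Criterion: -2*logL penalized by parameter count."""
    return -2.0 * logL + 2.0 * k

def sbc(logL, k, n):
    """Schwarz Bayesian Criterion (BIC): penalty grows with log(sample size)."""
    return -2.0 * logL + k * math.log(n)

def lrt_statistic(logL_reduced, logL_full):
    """Likelihood ratio (chi-square) statistic; valid only for nested models."""
    return -2.0 * (logL_reduced - logL_full)
```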

Methods:  The Hellinger Distance is a theoretically robust measure for comparing two distributions.  Thus, the minimum Hellinger Distance (HD) is a natural criterion for selecting the structural form between two competing parametric models.  To calculate the Minimum Hellinger Distance Criterion (MHDC), each structural model to be tested is fit to the data.  The HD between each model fit and a nonparametric (smoother) regression of the data is then computed, and the model with the minimum HD is selected as the more appropriate parametric model.  The MHDC uses the nonparametric regression as the standard for comparison because, under very general assumptions, it converges to the true regression function under a strong metric.  We present simulation studies for fixed-effects and simple mixed-effects cases to demonstrate this methodology.  The MHDC can be used to motivate Minimum Hellinger Distance Estimation (MHDE), which provides semi-parametric estimates of the model parameters.  The theory of MHDE is developed, and estimation bias and variability are assessed by asymptotic methods and a simulation study.
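For illustration only, the sketch below outlines the MHDC recipe just described, under assumptions not stated in the abstract: a LOWESS smoother stands in for the nonparametric regression, the Hellinger distance is taken between fitted curves normalized to unit area on a common grid, and the Emax and linear structural forms, like the simulated data, are purely hypothetical.

```python
# Illustrative MHDC sketch under the assumptions stated above; not the authors'
# implementation.
import numpy as np
from scipy.optimize import curve_fit
from statsmodels.nonparametric.smoothers_lowess import lowess

def hellinger(f, g, x):
    """Squared Hellinger distance between two curves on grid x, each clipped to
    be nonnegative and normalized to integrate to one."""
    f = np.clip(f, 1e-12, None)
    g = np.clip(g, 1e-12, None)
    f = f / np.trapz(f, x)
    g = g / np.trapz(g, x)
    return 0.5 * np.trapz((np.sqrt(f) - np.sqrt(g)) ** 2, x)

# Hypothetical exposure-response data (Emax-like truth plus noise)
rng = np.random.default_rng(0)
x = np.linspace(0.1, 10.0, 80)
y = 5.0 * x / (2.0 + x) + rng.normal(0.0, 0.4, x.size)

# Two competing, non-nested structural models
def emax_model(x, e0, emx, ec50):
    return e0 + emx * x / (ec50 + x)

def linear_model(x, a, b):
    return a + b * x

p_emax, _ = curve_fit(emax_model, x, y, p0=[0.1, 4.0, 2.0])
p_lin, _ = curve_fit(linear_model, x, y, p0=[0.1, 0.5])

# Nonparametric regression used as the standard of comparison
smooth = lowess(y, x, frac=0.4, return_sorted=False)

hd_emax = hellinger(emax_model(x, *p_emax), smooth, x)
hd_lin = hellinger(linear_model(x, *p_lin), smooth, x)
print("MHDC selects:", "Emax" if hd_emax < hd_lin else "linear")
```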

Results:  The MHDC provided model discrimination similar to or better than that of the likelihood-based criteria for nonhierarchical models, especially when sample sizes were small or the likelihood used in parameter estimation was mis-specified.  Similarly, MHDE maintained properties comparable to those of maximum likelihood or traditional least-squares estimators when the likelihood was known; MHDE has been shown to be asymptotically consistent and normally distributed with appropriate mean and variance.  MHDE also provided stable inference when the likelihood was mis-specified.

Conclusion:  MHDC is a viable, novel method for selecting between nonhierarchical models.  MHDE is a semi-parametric estimator that provides efficient parameter estimation when the likelihood is correctly specified.  Furthermore, MHDE is robust to mis-specification of the error distribution and the regression function.




Reference: PAGE 15 (2006) Abstr 1018 [www.page-meeting.org/?abstract=1018]
Oral Presentation: Stuart Beal Methodology Session