
A novel weighted likelihood estimation with empirical Bayes flavor

Author

Summary, in English

We propose a novel approach to estimation in which each individual observation in a random sample is used to derive an estimator of an unknown parameter via the maximum likelihood principle. These individual estimators are then combined as a weighted average to produce the final estimator. The weights are chosen to be proportional to the likelihood function evaluated at the estimators based on each observation. The method can be related to a Bayesian approach in which the prior distribution is data driven. In the case of estimating a location parameter of a unimodal density, the prior distribution is the empirical distribution of the sample, which converges to the true distribution that generated the data as the sample size increases.



We provide several examples illustrating the new method, argue for its consistency, and conduct simulation studies to assess the performance of the estimators. It turns out that this straightforward methodology produces consistent estimators, which seem to be comparable with those obtained by the maximum likelihood method.
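For a concrete sense of the procedure described above, the following is a minimal sketch in Python (NumPy/SciPy), not code from the paper. It assumes a Cauchy location model, where the single-observation maximum likelihood estimate of the location is the observation itself; the function name and the example values are illustrative only.

```python
import numpy as np
from scipy import stats

def weighted_likelihood_location(x, logpdf=stats.cauchy.logpdf):
    """Weighted-likelihood estimate of a location parameter.

    Each observation x_i serves as a single-observation MLE of the
    location (valid for a density that is unimodal with mode at 0).
    The full-sample log-likelihood is evaluated at each x_i, and the
    final estimate is the likelihood-weighted average of the x_i.
    """
    x = np.asarray(x, dtype=float)
    # log L(theta = x_i) = sum_j log f(x_j - x_i), for every candidate i
    loglik = logpdf(x[None, :] - x[:, None]).sum(axis=1)
    # Convert to normalized weights; subtract the max for numerical stability
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    return np.sum(w * x)

# Illustrative usage: Cauchy sample with (hypothetical) true location 2.0
rng = np.random.default_rng(0)
sample = stats.cauchy.rvs(loc=2.0, size=200, random_state=rng)
print(weighted_likelihood_location(sample))
```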

Publishing year

2015

Language

English

Publication/Series

Working Papers in Statistics

Issue

6

Document type

Working paper

Publisher

Department of Statistics, Lund University

Topic

  • Probability Theory and Statistics

Keywords

  • Consistency
  • data-dependent prior
  • empirical Bayes
  • exponentiated distribution
  • location parameter
  • maximum likelihood estimator
  • super-efficiency
  • unbounded likelihood

Status

Published