# The distribution of entropy estimators based on maximum mean log-likelihood

University of Groningen, Department of Computing Science,
P.O. Box 800, NL-9700 AV Groningen, The Netherlands,
phone: +31.50.363 3940 - fax: +31.50.363 38005 - e-mail: rudy@cs.rug.nl

#### Abstract

Entropy estimation is often based on the Maximum Likelihood (ML)
method. When the probability density function (pdf) models reality
sufficiently well, the maximum average log-likelihood is a good
(negative-)entropy estimator.
Previous work by the author suggests that, under certain conditions, the
variance of such a statistic consists of a basic variance plus a number of
statistically independent contributions corresponding to the independently
adjustable parameters of the pdf. In accordance with this assumption, we
derive, and justify under certain conditions, the distribution of an
ML-based entropy estimator.

This knowledge about the distribution can be used to bridge the gap between
the modelling process and the application of optimal models, in particular
for the selection of an optimal probability density function in the presence
of superfluous parameters.
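The core idea of the abstract — that the maximum average log-likelihood of a fitted pdf estimates the negative entropy of the source — can be illustrated with a minimal sketch. The example below is not the paper's method, only a hedged illustration under a simple assumption: samples drawn from a Gaussian, with a Gaussian pdf fitted by ML, whose average log-likelihood is then compared with the known analytic entropy of the source.

```python
import math
import random

random.seed(0)

# Assumption for this sketch: the true source is a zero-mean Gaussian
# with standard deviation sigma, so its entropy is known in closed form.
n = 10_000
sigma = 2.0
xs = [random.gauss(0.0, sigma) for _ in range(n)]

# ML fit of a Gaussian pdf: sample mean and (ML, i.e. 1/n) variance.
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n

# Maximum average log-likelihood under the fitted pdf.
avg_ll = sum(
    -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
    for x in xs
) / n

# The abstract's claim: -max average log-likelihood estimates the entropy.
entropy_estimate = -avg_ll
true_entropy = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

print(f"estimate: {entropy_estimate:.4f}, analytic: {true_entropy:.4f}")
```

For a well-specified model and large `n` the two values agree closely; the paper's contribution concerns the *distribution* of this estimator, including the variance contributions of the adjustable parameters (`mu` and `var` here).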

#### Published

Twenty-first Symposium on Information Theory in the Benelux,
May 25-26, 2000, Wassenaar (NL), pp. 231-238,
ed. J. Biemond; Werkgemeenschap Informatie- en
Communicatietheorie, Enschede, The Netherlands,
and IEEE Benelux Chapter on Information Theory,
ISBN 90-71048
