From maximum likelihood to an entropy estimate

R. Moddemeijer

University of Twente, Department of Electrical Engineering,
P.O. Box 217, NL-7500 AE Enschede, The Netherlands,

present address

University of Groningen, Department of Computing Science,
P.O. Box 800, NL-9700 AV Groningen, The Netherlands,
phone: +31.50.363 3940 - fax: +31.50.363 38005 - e-mail: rudy@cs.rug.nl

Abstract

We present some preliminary thoughts on a method to estimate entropies using the maximum likelihood method, together with the redefinition and generalization of the maximum likelihood method needed to obtain this estimate. The estimated maximum of the mean log-likelihood function is interpreted as a neg(ative)-entropy estimate. Two separate sources of bias can be distinguished: R-bias, caused by insufficient representation of the "true" probability density function by its estimate, and N-bias, due to the finite sample size. We also present the variance of our (entropy or mean log-likelihood) estimate.
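The central idea of the abstract can be illustrated with a minimal sketch: fit a parametric density to the data by maximum likelihood, then take the mean log-likelihood at the fitted parameters as an estimate of the neg-entropy. The Gaussian model, sample size, and variable names below are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

# Draw N samples from a standard normal; the "true" density is N(0, 1).
n = 10000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

# Maximum-likelihood estimates of the Gaussian parameters.
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n

# Mean log-likelihood at the ML estimate, interpreted as a
# neg-entropy estimate (assumed reading of the abstract's idea).
neg_entropy_est = sum(
    -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
    for x in xs
) / n

# For N(0, 1) the true neg-entropy is -0.5 * log(2*pi*e).
print(neg_entropy_est)
```

Because the same data are used both to fit the parameters and to evaluate the mean log-likelihood, the estimate is biased for finite n; this is the N-bias mentioned in the abstract. Choosing a model family too poor to represent the true density would add the R-bias.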

Published

Eighth Symposium on Information Theory in the Benelux, May 21-22, 1987, Deventer, The Netherlands, pp. 86-92, Ed. Kleima, D., Werkgemeenschap Informatie- en Communicatietheorie, Enschede, ISBN 90-71048-03-9