MATLAB Function Reference

condentropy

Estimate the conditional entropy of the stationary signal x given the stationary signal y, assuming independent pairs (x,y) of samples.

Syntax

Description

Conditional entropy estimation is, like plain entropy estimation, a two-stage process: first a two-dimensional histogram is estimated, and thereafter the entropy is calculated. For an explanation of the usage of the histogram descriptor, see histogram2 .

In case of discrete stochastic variables i and j in the integer subranges lowerx <= i < upperx and lowery <= j < uppery, the descriptor should be selected as [lowerx,upperx,upperx-lowerx;lowery,uppery,uppery-lowery]. The R(epresentation)-unbiased entropy will be estimated.
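As a sketch of the discrete case, the descriptor above amounts to one histogram bin per integer value. The exact calling sequence is defined in condentropy.m; the example below assumes the function accepts the two signals followed by the descriptor, and the value ranges are chosen purely for illustration.

```matlab
% Hypothetical example: discrete signals with 0 <= i < 8 and 0 <= j < 4.
x = floor(8*rand(1, 1000));   % integer samples in 0..7
y = floor(4*rand(1, 1000));   % integer samples in 0..3

% One bin per integer value: [lowerx,upperx,upperx-lowerx; lowery,uppery,uppery-lowery]
d = [0, 8, 8; 0, 4, 4];

% Assumed call form; see condentropy.m for the exact signature.
h = condentropy(x, y, d);
```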

In case of a continuous stochastic variable the descriptor can be left unspecified; in this case the default descriptor of histogram2 will be used.

The estimate depends on the value of approach.

The base of the logarithm determines the unit of measurement. By default base e (nats) is used; alternative choices are 2 (bits) and 10 (Hartleys).

As a result the function returns the estimate, the N-bias (Nbias) of the estimate, the estimated standard error sigma, and the descriptor used.
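For a continuous signal the descriptor can be omitted, and all four results can be captured at once. Again, the output order below is taken from the description above, but the precise signature should be checked in condentropy.m.

```matlab
% Hypothetical example: continuous signals, default descriptor.
x = randn(1, 1000);
y = x + 0.5*randn(1, 1000);   % y correlated with x

% Assumed call form and output order as described above.
[estimate, nbias, sigma, descriptor] = condentropy(x, y);
```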

See Also

entropy
entropy2
histogram2

Literature

Moddemeijer, R., "On Estimation of Entropy and Mutual Information of Continuous Distributions", Signal Processing, 1989, vol. 16, nr. 3, pp. 233-246.

For the principle of Minimum Mean Square Error estimation see:

Moddemeijer, R., "An efficient algorithm for selecting optimal configurations of AR-coefficients", Twentieth Symp. on Information Theory in the Benelux, May 27-28, 1999, Haasrode (B), pp. 189-196, eds. A. Barbé et al., Werkgemeenschap Informatie- en Communicatietheorie, Enschede (NL), and IEEE Benelux Chapter on Information Theory, ISBN: 90-71048-14-4.

Source code

condentropy.m

Copyright R. Moddemeijer