Relations among Fisher, Shannon-Wiener and Kullback Measures of Information for Continuous Variables

Anton Cedilnik and Katarina Košmelj

Abstract

In statistics, Fisher was the first to introduce a measure of the amount of information supplied by data about an unknown parameter. We analyze the disadvantages of the Fisher information measure for the optimization of sampling designs. To overcome these drawbacks, we modify the Fisher information measure and extend it to the multivariate setting. It turns out that a reasonable modification of the Fisher information measure leads to a special case of the Kullback information measure, in both the univariate and multivariate settings. Using Shannon's and Wiener's concept of information, we also give a simple derivation of the Kullback information measure for the special case in which the prior distribution of the parameter is uniform and the posterior distribution is truncated normal.
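As a brief reminder, the standard univariate forms of the two measures being related are, in conventional textbook notation (the symbols $f(x;\theta)$, $I(\theta)$, and $I(f_1 : f_2)$ are the usual ones and not necessarily those used in the paper):

$$
I(\theta) = \mathrm{E}\!\left[\left(\frac{\partial}{\partial\theta}\ln f(X;\theta)\right)^{2}\right]
= \int \left(\frac{\partial}{\partial\theta}\ln f(x;\theta)\right)^{2} f(x;\theta)\,dx,
\qquad
I(f_1 : f_2) = \int f_1(x)\,\ln\frac{f_1(x)}{f_2(x)}\,dx,
$$

where the first is the Fisher information of $\theta$ (the expected squared score, under the usual regularity conditions) and the second is the Kullback information for discriminating between the densities $f_1$ and $f_2$.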