metric and information entropy

Ravi C Venkatesan (rcv_dac@giaspn01.vsnl.net.in)
Sun, 28 Jun 1998 22:37:31 +0500 (GMT+0500)

Hi,

I would be deeply grateful for some advice/pointers on the following
issue: given two points that lie either within the same statistical
probability distribution (pdf, hereafter) or within different pdf's
(mixture models made up of similar or dissimilar pdf's), an
interpolation between them can be obtained.

One such methodology would be to express the distance between any two
given points in the form of a metric. In fact, by applying constraints
(such as requiring that the interpolation pass through regions assigned
a high probability, and that the arc length between the points be
minimized), a variational principle describing the interpolation between
two points is derivable.
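As an illustration of the sort of variational problem described above (this is my own minimal sketch, not a reference implementation): one can discretize a path between the two endpoints and numerically minimize a cost that weights each arc-length segment by -log p at its midpoint, so that the path is simultaneously short and drawn toward high-probability regions. The cost functional, the Gaussian pdf, and all parameter choices below are illustrative assumptions.

```python
import numpy as np

def neg_log_p(x, mean, cov_inv, log_norm):
    """Negative log-density of a multivariate Gaussian at point x."""
    d = x - mean
    return 0.5 * d @ cov_inv @ d + log_norm

def path_cost(pts, mean, cov_inv, log_norm):
    """Sum over segments of (segment length) * (-log p at the midpoint).

    Short segments through high-probability regions are cheap; long
    segments through low-probability regions are expensive.
    """
    total = 0.0
    for a, b in zip(pts[:-1], pts[1:]):
        mid = 0.5 * (a + b)
        total += np.linalg.norm(b - a) * neg_log_p(mid, mean, cov_inv, log_norm)
    return total

def optimize_path(start, end, mean, cov, n=12, iters=150, lr=0.01):
    """Crude finite-difference gradient descent on the interior path points.

    The endpoints stay fixed; the step size is halved whenever a step
    fails to decrease the cost, so the cost is non-increasing.
    """
    cov_inv = np.linalg.inv(cov)
    k = len(mean)
    log_norm = 0.5 * (k * np.log(2 * np.pi) + np.log(np.linalg.det(cov)))
    pts = np.linspace(start, end, n)  # straight-line initial guess
    eps = 1e-5
    step = lr
    for _ in range(iters):
        base = path_cost(pts, mean, cov_inv, log_norm)
        grad = np.zeros_like(pts)
        for i in range(1, n - 1):          # interior points only
            for j in range(pts.shape[1]):
                p = pts.copy()
                p[i, j] += eps
                grad[i, j] = (path_cost(p, mean, cov_inv, log_norm) - base) / eps
        trial = pts - step * grad
        if path_cost(trial, mean, cov_inv, log_norm) < base:
            pts = trial
        else:
            step *= 0.5  # backtrack on the step size
    return pts
```

For a standard 2-D Gaussian, a path between (-2, -1) and (2, -1) produced this way bows toward the mean relative to the straight line, which is the qualitative behaviour the high-probability constraint is meant to enforce.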

My query is this: does anyone know of a self-consistent manner in which
the metric (signifying the distance between any two given points lying
within a pdf) can be related to the information entropy?

Many thanks for your time; I look forward to hearing from you soon!

Regards


Ravi Venkatesan

P.S. A relationship between ANY distance between two given points lying
within a statistical pdf and the information entropy could be MOST useful,
even if not expressed explicitly in the form of a metric. The pdf's
mentioned here are most typically multi-dimensional Gaussians and
Dirichlet distributions.
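One standard relationship of the kind asked about, for the Gaussian case at least (offered as a pointer, not a full answer): the Kullback-Leibler divergence between two Gaussians — a distance-like quantity, though not a true metric, since it is asymmetric — decomposes exactly as cross-entropy minus differential entropy, KL(p||q) = H(p, q) - H(p). The closed-form expressions below are standard; the function names are mine.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy H(p) of a multivariate Gaussian, in nats."""
    k = cov.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(cov)))

def gaussian_cross_entropy(m0, c0, m1, c1):
    """Cross-entropy H(p, q) = -E_p[log q] for Gaussians p=(m0,c0), q=(m1,c1)."""
    k = len(m0)
    c1_inv = np.linalg.inv(c1)
    d = m1 - m0
    return 0.5 * (k * np.log(2 * np.pi) + np.log(np.linalg.det(c1))
                  + np.trace(c1_inv @ c0) + d @ c1_inv @ d)

def gaussian_kl(m0, c0, m1, c1):
    """Closed-form KL(p||q) between two multivariate Gaussians."""
    k = len(m0)
    c1_inv = np.linalg.inv(c1)
    d = m1 - m0
    return 0.5 * (np.trace(c1_inv @ c0) + d @ c1_inv @ d - k
                  + np.log(np.linalg.det(c1) / np.linalg.det(c0)))
```

The identity KL(p||q) = H(p, q) - H(p) ties this "distance" between two pdf's directly to information entropy; symmetrizing it, or taking its infinitesimal (Fisher-information) limit, are common routes toward an actual metric.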