[UAI] Divergence Measures

From: Blai Bonet (bonet@cs.ucla.edu)
Date: Mon Oct 22 2001 - 07:52:56 PDT


    Dear Colleagues,

    I am in need of a "divergence metric" that would render
    Bayesian updating a continuous operator; i.e.,

    if D(p,q) is the divergence between distributions p and q,
    and p^e is the result of updating p with evidence e:

      p^e[i] = p[i] P(e|i) / \sum_j p[j] P(e|j)

    where P(.|.) doesn't depend on p, then I would like to have

      (1) D(p^e,q^e) <= K*D(p,q)

    where K is a constant (preferably less than one).
    In addition, I need D(.,.) to be a metric on the space
    of distributions; i.e.,

      (2) D(p,q) = 0 iff p = q,
      (3) D(p,q) = D(q,p),
      (4) D(p,q) <= D(p,r) + D(r,q)
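
    For concreteness, here is a small Python/NumPy sketch of the update
    operator and of the ratio that (1) requires to be bounded (the
    function names are only for illustration):

      import numpy as np

      def bayes_update(p, lik):
          # p^e[i] = p[i] * P(e|i) / sum_j p[j] * P(e|j), with lik[i] = P(e|i)
          post = p * lik
          return post / post.sum()

      def contraction_ratio(D, p, q, lik):
          # The ratio D(p^e,q^e) / D(p,q); condition (1) asks this to be
          # bounded by a constant K (preferably K < 1) for all p, q, e.
          return D(bayes_update(p, lik), bayes_update(q, lik)) / D(p, q)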

    It is easy to see (with a counterexample) that the symmetric
    Kullback-Leibler divergence doesn't satisfy (1).
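
    For instance, continuing the sketch above, take p = (eps, 1-eps),
    q = (2*eps, 1-2*eps) and evidence whose likelihood strongly favors
    the first state. The symmetric KL between the posteriors tends to a
    positive constant while that between the priors goes to zero, so the
    ratio is unbounded and no constant K can work:

      def sym_kl(p, q):
          # symmetric Kullback-Leibler divergence: KL(p||q) + KL(q||p)
          return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

      for eps in (1e-2, 1e-3, 1e-4, 1e-5):
          p   = np.array([eps, 1.0 - eps])
          q   = np.array([2 * eps, 1.0 - 2 * eps])
          lik = np.array([1.0, eps])     # P(e|i): favors state 0
          r = contraction_ratio(sym_kl, p, q, lik)
          print("eps = %g   ratio = %g" % (eps, r))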

    I wonder if such a D(.,.) exists.

    Thanks in advance

    Blai Bonet


