Information Security and Cryptography Research Group

A Property of the Intrinsic Mutual Information

Matthias Christandl, Renato Renner, and Stefan Wolf

2003, Proceedings version (ISIT 2003): [ChReWo03b].

In the setting where two parties knowing random variables X and Y, respectively, want to generate a secret key by communication accessible to an adversary who additionally knows a finite random variable Z, the so-called intrinsic information between X and Y given Z proved useful for determining the number of extractable secret-key bits. Given a tripartite probability distribution P_XYZ, this information measure is, however, hard to compute in general, since a minimization has to be carried out over all possible discrete-output channels the adversary could use for processing her information Z. We substantially simplify this problem by showing that the output alphabets of these channels can, without loss of generality, be assumed to equal their input alphabet; in particular, this implies that there exists an optimal channel achieving the minimum, since the set of such channels is compact. The proofs of our results combine techniques from point-set topology, measure theory, and convex geometry.
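
As an informal illustration of the quantity being minimized, the intrinsic information is the infimum of I(X;Y|Z̄) over all channels mapping Z to Z̄; by the result above, it suffices to search over channels whose output alphabet has the same size as that of Z. The following Python sketch (not from the paper; the function names, the softmax parameterization of the channel, and the use of a generic numerical optimizer are all illustrative assumptions) approximates this restricted minimization for a small distribution.

import numpy as np
from scipy.optimize import minimize

def cond_mutual_info(p_xyz):
    # I(X;Y|Z) in bits for a joint distribution given as a 3D array p[x, y, z].
    p_z = p_xyz.sum(axis=(0, 1))
    p_xz = p_xyz.sum(axis=1)
    p_yz = p_xyz.sum(axis=0)
    total = 0.0
    for x in range(p_xyz.shape[0]):
        for y in range(p_xyz.shape[1]):
            for z in range(p_xyz.shape[2]):
                p = p_xyz[x, y, z]
                if p > 0:
                    total += p * np.log2(p * p_z[z] / (p_xz[x, z] * p_yz[y, z]))
    return total

def intrinsic_info(p_xyz, n_restarts=20, seed=0):
    # Approximate inf over channels Z -> Z̄ of I(X;Y|Z̄), restricting the
    # output alphabet of Z̄ to the size of Z (as the paper's result permits).
    nz = p_xyz.shape[2]
    rng = np.random.default_rng(seed)

    def objective(logits):
        # Softmax per row keeps the channel matrix stochastic.
        w = np.exp(logits.reshape(nz, nz))
        w /= w.sum(axis=1, keepdims=True)
        p_xy_zbar = np.einsum('xyz,zt->xyt', p_xyz, w)
        return cond_mutual_info(p_xy_zbar)

    best = np.inf
    for _ in range(n_restarts):
        res = minimize(objective, rng.normal(size=nz * nz), method='Nelder-Mead')
        best = min(best, res.fun)
    return best

# Example: X, Y independent uniform bits, Z = X XOR Y.
# Here I(X;Y|Z) = 1 bit, but discarding Z already gives I(X;Y) = 0,
# so the intrinsic information vanishes.
p = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p[x, y, x ^ y] = 0.25
print(cond_mutual_info(p))   # 1.0
print(intrinsic_info(p))     # close to 0.0

This brute-force search is only meant to make the definition concrete; it says nothing about the efficiency of computing the intrinsic information in general.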

BibTeX Citation

@unpublished{ChReWo03a,
    author       = {Matthias Christandl and Renato Renner and Stefan Wolf},
    title        = {A Property of the Intrinsic Mutual Information},
    year         = 2003,
    note         = {Proceedings version (ISIT 2003): \cite{ChReWo03b}},
}

Files and Links