The Empirical Distribution of Rate-Constrained Source Codes
Weissman, Tsachy; Ordentlich, Erik
HPL-2003-253
Keyword(s): rate-distortion theory; sample converse; denoising; empirical distribution
Abstract: Let X = (X_1, X_2, ...) be a stationary ergodic finite-alphabet
source, X^n denote its first n symbols, and Y^n be the codeword assigned to X^n
by a lossy source code. The empirical kth-order joint distribution
Q^k[X^n, Y^n](x^k, y^k) is defined as the frequency
of appearances of pairs of k-strings (x^k, y^k) along
the pair (X^n, Y^n). Our main interest is in the sample
behavior of this (random) distribution. Letting I(Q^k) denote
the mutual information I(X^k; Y^k) when (X^k, Y^k) ~ Q^k, we show that for any
(sequence of) lossy source code(s) of rate ≤ R,

    limsup_{n→∞} (1/k) I(Q^k[X^n, Y^n]) ≤ R + (1/k) H(X^k) − H̄(X)   a.s.,

where H̄(X) denotes the entropy rate of X. This is shown to imply, for a large
class of sources including all i.i.d. sources and all sources satisfying
the Shannon lower bound with equality, that for any sequence of codes
which is good in the sense of asymptotically attaining a point
on the rate-distortion curve,

    Q^k[X^n, Y^n] ⇒ P_{X^k Y^k}   a.s.,

whenever P_{X^k Y^k} is the unique distribution attaining the
minimum in the definition of the kth-order rate-distortion function. Further
consequences of these results are explored. These include a simple proof
of Kieffer's sample converse to lossy source coding, as well as pointwise
performance bounds for compression-based denoisers.

Notes: 27 Pages
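The empirical kth-order joint distribution defined in the abstract is easy to compute directly. The following sketch (function names hypothetical, sliding-window counting assumed; the paper's exact counting convention may differ) estimates Q^k from a source/codeword pair and evaluates the normalized mutual information (1/k) I(Q^k):

```python
from collections import Counter
from math import log2

def empirical_joint(x, y, k):
    """Empirical kth-order joint distribution Q^k[x^n, y^n]: the
    frequency of each pair of k-strings along (x, y), counted over
    sliding windows (an assumed convention for this illustration)."""
    n = len(x)
    counts = Counter((tuple(x[i:i + k]), tuple(y[i:i + k]))
                     for i in range(n - k + 1))
    total = sum(counts.values())
    return {pair: c / total for pair, c in counts.items()}

def mutual_information(q):
    """I(X^k; Y^k) in bits for a joint pmf q over (x^k, y^k) pairs."""
    px, py = Counter(), Counter()
    for (a, b), p in q.items():
        px[a] += p
        py[b] += p
    return sum(p * log2(p / (px[a] * py[b]))
               for (a, b), p in q.items() if p > 0)

# Toy illustration: with the identity "code" Y^n = X^n, the empirical
# joint is concentrated on matching pairs, so I(Q^k) equals the entropy
# of the empirical k-string marginal of x.
x = [0, 1, 1, 0, 1, 0, 0, 1]
y = x[:]
k = 2
q = empirical_joint(x, y, k)
print(mutual_information(q) / k)  # (1/k) I(Q^k)
```

For a genuine rate-R code, the theorem above bounds this printed quantity, asymptotically in n, by R plus the redundancy term (1/k) H(X^k) − H̄(X).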
