Divergence Bound on KT

🌱

Theorem
InfoTheory

Let $q^n$ be the KT coding distribution on $\mathcal{X}^n$, where $|\mathcal{X}| = m$. Then for any pmf $p(x)$ on $\mathcal{X}$ and i.i.d. source distribution $p^n(x^n) = \prod_{i=1}^{n} p(x_i)$,

$$\frac{1}{n} D\!\left(p^n \,\middle\|\, q^n\right) \le \frac{(m-1)\log n}{2n} + \frac{K_0}{n}, \qquad \forall\, n \ge 1,$$

for some constant $K_0$.
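
As a sanity check, here is a minimal numerical sketch (not from the original note) that computes $D(p^n \| q^n)$ exactly for small $n$ by enumerating all sequences, using the standard sequential (add-$1/2$) form of the KT estimator. It assumes natural logarithms, and the pmf `p = [0.7, 0.3]` is an arbitrary illustrative choice; the theorem predicts that $n \cdot \big(\tfrac{1}{n}D(p^n\|q^n) - \tfrac{(m-1)\log n}{2n}\big)$ stays bounded (by some $K_0$).

```python
import itertools
import math

def kt_prob(xn, m):
    """KT probability of the sequence xn over an alphabet of size m,
    built sequentially: q(x_{i+1} = a | x^i) = (count_a + 1/2) / (i + m/2)."""
    counts = [0] * m
    prob = 1.0
    for i, x in enumerate(xn):
        prob *= (counts[x] + 0.5) / (i + m / 2)
        counts[x] += 1
    return prob

def divergence(p, n):
    """Exact D(p^n || q^n) in nats, by brute-force enumeration of X^n."""
    m = len(p)
    total = 0.0
    for xn in itertools.product(range(m), repeat=n):
        pn = math.prod(p[x] for x in xn)   # iid source probability
        qn = kt_prob(xn, m)                # KT coding probability
        total += pn * math.log(pn / qn)
    return total

p = [0.7, 0.3]   # arbitrary pmf on a binary alphabet (m = 2)
m = len(p)
for n in range(1, 11):
    lhs = divergence(p, n) / n
    main_term = (m - 1) * math.log(n) / (2 * n)
    # n * (lhs - main_term) should remain bounded as n grows,
    # consistent with the K0 / n residual term in the theorem.
    print(n, round(lhs, 5), round(main_term, 5), round(n * (lhs - main_term), 5))
```

The brute-force sum over $|\mathcal{X}|^n$ sequences is only feasible for small $n$, but it avoids any approximation, so the printed residual directly witnesses the bounded-$K_0$ behavior the theorem asserts.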