On the Jensen-Shannon divergence and variational distance

Shi-Chun Tsai*, Wen-Guey Tzeng, Hsin Lung Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Scopus citations


We study the distance between two probability distributions under two different metrics: a new metric induced by the Jensen-Shannon divergence, and the well-known L1 metric. We show that several important results and constructions in computational complexity under the L1 metric carry over to the new metric, such as Yao's next-bit predictor, the existence of extractors, the leftover hash lemma, and the construction of expander-graph-based extractors. Finally, we show that the parity lemma, which is useful in the study of pseudorandomness, does not hold under the new metric.
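The two distance measures compared in the abstract can be sketched concretely. The following is a minimal illustration (not taken from the paper): the Jensen-Shannon divergence of distributions P and Q is the average KL divergence of each to their midpoint M = (P + Q)/2, its square root is known to be a metric, and the L1 (variational) distance is the sum of coordinate-wise absolute differences. Function names here are illustrative choices.

```python
import math

def kl_divergence(p, q):
    """KL divergence in bits, with the 0 * log(0/q) = 0 convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def js_metric(p, q):
    """The square root of the JS divergence, which is a metric."""
    return math.sqrt(js_divergence(p, q))

def l1_distance(p, q):
    """L1 (twice the variational) distance between two distributions."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

# Disjoint distributions maximize both measures (with log base 2):
p, q = [1.0, 0.0], [0.0, 1.0]
print(js_divergence(p, q))  # 1.0
print(l1_distance(p, q))    # 2.0
```

Note that the JS divergence is bounded (it is at most 1 when measured in bits), whereas the L1 distance ranges up to 2; this boundedness is one reason the induced metric behaves differently from L1 in some settings.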

Original language: English
Pages (from-to): 3333-3336
Number of pages: 4
Journal: IEEE Transactions on Information Theory
Issue number: 9
State: Published - 1 Sep 2005


Keywords:
  • Expander
  • Extractors
  • Jensen-Shannon divergence
  • Leftover hash lemma
  • Parity lemma

