Universal Estimation of Information Measures for Analog Sources
145,97 € with code EXTRA (regular price: 162,19 €)
  • We will send in 10–14 business days.

Description

Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among others, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in information theory. It will be of interest to students, practitioners and researchers working in information theory.
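
To give a flavor of the nonparametric algorithms the survey covers, here is a minimal Python sketch of the classic Kozachenko-Leonenko k-nearest-neighbor estimator of differential entropy for analog samples. The function name, the default k = 3, and the SciPy-based implementation are our own illustrative choices, not code from the book.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    samples: (n, d) array of i.i.d. draws from an unknown density.
    The estimator is 'universal' in the sense used above: it relies only
    on nearest-neighbor distances, not on any parametric model of the data.
    """
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]  # treat a 1-D sample as points in R^1
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor; query k+1
    # neighbors because each point is returned as its own 0-distance match.
    eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit Euclidean ball.
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.standard_normal((5000, 1))
    # True differential entropy of N(0, 1) is 0.5*ln(2*pi*e) ~ 1.4189 nats.
    print(knn_entropy(z), 0.5 * np.log(2 * np.pi * np.e))
```

Mutual information and divergence admit analogous nearest-neighbor constructions, and their estimation is the emphasis of the survey.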

SAVE 10% with code EXTRA. The promotion ends in 23 d 19:08:51. The discount code is valid for purchases from 10 €. Discounts do not stack.

Log in and you will receive 1,62 Book Euros for this item.

  • Author: Qing Wang
  • Publisher:
  • Year: 2009
  • Pages: 104
  • ISBN-10: 1601982305
  • ISBN-13: 9781601982308
  • Format: 15.6 x 23.4 x 0.6 cm, paperback
  • Language: English

Reviews

  • No reviews
0 customers have rated this item.