New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

  • Leandro Pardo
Publisher: MDPI
ISBN-13: 9783038979364
ISBN-10: 3038979368


About the book -

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures is written by Leandro Pardo and published by MDPI. It is identified by ISBN-10 3038979368 and ISBN-13 9783038979364.

This book presents new and original research in statistical information theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, addressing a range of statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, like Wald's statistic, the likelihood ratio statistic, and Rao's score statistic, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that even a small deviation from the assumed model can have a drastic effect on the performance of these classical tests. In particular, the book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
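To give a flavor of why minimum divergence estimators can be robust where maximum likelihood is not, here is a minimal, hedged sketch using one well-known member of the divergence family, the density power divergence (DPD) of Basu et al., for the mean of a normal distribution with known unit variance. This is an illustrative toy, not the book's exact procedure: the helper name `mdpde_normal_mean` and the tuning choice `beta=0.5` are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mdpde_normal_mean(x, beta=0.5):
    """Minimum density power divergence estimator (MDPDE) of mu for N(mu, 1).

    Illustrative sketch: with the variance known, the integral term of the
    DPD does not depend on mu, so minimizing the divergence is equivalent
    to maximizing the beta-weighted likelihood term below.  Observations
    far from mu get weight exp(-beta * (x - mu)^2 / 2) close to zero,
    which is the source of the robustness; beta -> 0 recovers the MLE.
    """
    def neg_objective(mu):
        return -np.mean(np.exp(-beta * (x - mu) ** 2 / 2.0))
    # Bracket the search around the median so gross outliers cannot
    # pull the one-dimensional optimizer to a spurious local minimum.
    center = np.median(x)
    res = minimize_scalar(neg_objective, bounds=(center - 3.0, center + 3.0),
                          method="bounded")
    return res.x

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=95)
x = np.concatenate([clean, np.full(5, 10.0)])  # 5 gross outliers at 10

mle = x.mean()                 # classical estimator, dragged toward the outliers
robust = mdpde_normal_mean(x)  # stays near the true mean 0
```

A robust Wald-type test, in the spirit described above, would then replace the MLE in the classical Wald statistic with such a minimum divergence estimator (and the corresponding asymptotic variance), so that contaminated samples no longer inflate or deflate the test statistic.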