Sale price £86.69 GBP (regular price £110.00 GBP)
Free UK Shipping

Freshly printed - please allow 8 days lead time

Optimal Estimation of Parameters

A comprehensive and consistent theory of estimation, including a description of a powerful new tool, the generalized maximum capacity estimator.

Jorma Rissanen (Author)

9781107004740, Cambridge University Press

Hardback, published 7 June 2012

170 pages, 8 b/w illus., 3 tables
25.4 x 17.8 x 1.2 cm, 0.51 kg

'In this splendid new book, Jorma Rissanen, the originator of the minimum description length (MDL) principle, puts forward a comprehensive theory of estimation which differs in several ways from the standard Bayesian and frequentist approaches. During the development of MDL over the last 30 years, it gradually emerged that MDL could be viewed, informally, as a maximum probability principle that directly extends Fisher's classical maximum likelihood method to allow for estimation of a model's structural properties. Yet providing a formal link between MDL and maximum probability remained elusive until the arrival of this book. By making the connection mathematically precise, Rissanen now ties up the loose ends of MDL theory and at the same time develops a beautiful, unified, entirely original and fully coherent theory of estimation, which includes hypothesis testing as a special case.' Peter Grünwald, Centrum voor Wiskunde en Informatica, The Netherlands

This book presents a comprehensive and consistent theory of estimation. The framework described leads naturally to a generalized maximum capacity estimator. This approach allows the optimal estimation of real-valued parameters, of their number and of intervals, and provides common ground for explaining the power of these estimators. Beginning with a review of coding and the key properties of information, the author goes on to discuss the techniques of estimation and develops the generalized maximum capacity estimator, based on a new form of Shannon's mutual information and channel capacity. Applications of this powerful technique in hypothesis testing and denoising are described in detail. Offering an original and thought-provoking perspective on estimation theory, Jorma Rissanen's book is of interest to graduate students and researchers in the fields of information theory, probability and statistics, econometrics and finance.

1. Introduction
2. Coding
3. Basics of information
4. Modeling problem
5. Other optimality properties
6. Interval estimation
7. Hypothesis testing
8. Denoising
9. Sequential models
Appendix A. Elements of algorithmic information
Appendix B. Universal prior for integers.

Subject Areas: Signal processing [UYS], Pattern recognition [UYQP], Machine learning [UYQM], Coding theory & cryptology [GPJ], Information theory [GPF]
