Regular price £70.99 GBP, sale price £56.99 GBP
Free UK Shipping

Freshly printed - allow 8 days lead time

Learning Theory
An Approximation Theory Viewpoint

A general overview of theoretical foundations; the first book to emphasize the approximation theory viewpoint.

Felipe Cucker (Author), Ding Xuan Zhou (Author)

9780521865593, Cambridge University Press

Hardback, published 29 March 2007

238 pages, 20 b/w illus.
23.1 x 16 x 1.7 cm, 0.36 kg

'… the book under review focuses on the mathematical foundations of learning theory. It is an excellent monograph on the subject. A major novelty is the focus on the point of view of approximation. This distinguishes the book from the majority of previous works on learning theory, which share a prevalent statistics/computer science flavor. However, this doesn't mean at all that the monograph is written only for 'approximation people'. On the contrary, it nicely provides a general overview of the theoretical foundations of the subject also to a broad spectrum of researchers in learning and related fields.' Mathematical Reviews

The goal of learning theory is to approximate a function from sample values. To attain this goal, learning theory draws on a variety of diverse subjects, specifically statistics, approximation theory, and algorithmics. Ideas from all these areas have blended to form a subject whose many successful applications have triggered its rapid growth during the last two decades. This is the first book to give a general overview of the theoretical foundations of the subject with an emphasis on the approximation theory viewpoint, while still offering a balanced treatment. It is based on courses taught by the authors and is reasonably self-contained, so it will appeal to a broad spectrum of researchers in learning theory and adjacent fields. It will also serve as an introduction for graduate students and others entering the field who wish to see how the problems raised in learning theory relate to other disciplines.
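As a purely illustrative sketch (not taken from the book), the following Python/NumPy snippet shows the kind of object the description above refers to: approximating an unknown function from noisy sample values by Tikhonov-regularized least squares in a Gaussian-kernel hypothesis space. The target function, kernel width, and regularization parameter are arbitrary choices for the demonstration.

```python
# Illustrative sketch: regularized least squares in a Gaussian-kernel
# hypothesis space. All parameter values are arbitrary demo choices.
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    """Gaussian (RBF) kernel matrix between two sets of sample points."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)

# Noisy sample values of an unknown target function.
m = 30                                   # sample size
x = rng.uniform(-1.0, 1.0, size=m)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(m)

# Tikhonov-regularized least squares: solve (K + m*lambda*I) c = y.
lam = 1e-2                               # regularization parameter
K = gaussian_kernel(x, x)
c = np.linalg.solve(K + m * lam * np.eye(m), y)

# Evaluate the learned function f(t) = sum_i c_i * K(t, x_i) on a grid.
t = np.linspace(-1.0, 1.0, 200)
f_t = gaussian_kernel(t, x) @ c
print(f_t[:5])
```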

Preface
Foreword
1. The framework of learning
2. Basic hypothesis spaces
3. Estimating the sample error
4. Polynomial decay approximation error
5. Estimating covering numbers
6. Logarithmic decay approximation error
7. On the bias-variance problem
8. Regularization
9. Support vector machines for classification
10. General regularized classifiers
Bibliography
Index.

Subject Areas: Mathematical theory of computation [UYA], Algorithms & data structures [UMB], Probability & statistics [PBT]
