Regular price £64.99 GBP, sale price £52.89 GBP
Free UK Shipping

Freshly Printed - allow 4 days lead time

Random Matrix Methods for Machine Learning

This unified random matrix approach to large-dimensional machine learning covers applications from power detection to deep neural networks.

Romain Couillet (Author), Zhenyu Liao (Author)

9781009123235, Cambridge University Press

Hardback, published 21 July 2022

408 pages
25.1 x 17.4 x 2.3 cm, 0.87 kg

'This is a very timely and important book. Romain Couillet and Zhenyu Liao provide a great entry point into active, recent research on the applications of Random Matrix Theory as it pertains to high-dimensional statistics and analysis of machine learning algorithms. RMT was born in statistics with Wishart and later became, via Wigner, a great pillar of quantum and statistical physics before being recently pushed by mathematicians to deeper universality results. It is quite fitting that it now comes back to the modern problems and methods of statistics with this very well-organized and carefully written book by two leading experts.' Gérard Ben Arous, Courant Institute of Mathematical Sciences, New York University

This book presents a unified theory of random matrices for applications in machine learning, offering a large-dimensional view of data that exploits concentration and universality phenomena. This enables a precise understanding, and possible improvements, of the core mechanisms at play in real-world machine learning algorithms. The book opens with a thorough introduction to the theoretical basics of random matrices, which serves as a foundation for a wide range of applications, from SVMs through semi-supervised learning, unsupervised spectral clustering, and graph methods, to neural networks and deep learning. For each application, the authors discuss small- versus large-dimensional intuitions of the problem, followed by a systematic random matrix analysis of the resulting performance and possible improvements. All concepts, applications, and variations are illustrated numerically on synthetic as well as real-world data, with MATLAB and Python code provided on the accompanying website.
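As a flavour of the large-dimensional phenomena the book analyses, the short Python sketch below (an illustration written for this listing, not code from the book's website) shows how the eigenvalues of a sample covariance matrix built from i.i.d. data spread over the Marchenko-Pastur support rather than concentrating at 1, once the dimension p is comparable to the sample size n.

```python
# Minimal illustration (assumed example, not from the book): eigenvalues of a
# sample covariance matrix with i.i.d. entries fill the Marchenko-Pastur
# interval [(1 - sqrt(c))^2, (1 + sqrt(c))^2], where c = p/n.
import numpy as np

p, n = 400, 1600              # data dimension and sample size, c = p/n = 0.25
c = p / n

X = np.random.randn(p, n)     # i.i.d. standard Gaussian data, true covariance I_p
S = X @ X.T / n               # sample covariance matrix
eigs = np.linalg.eigvalsh(S)  # its eigenvalues

lower, upper = (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2
print(f"empirical eigenvalue range: [{eigs.min():.3f}, {eigs.max():.3f}]")
print(f"Marchenko-Pastur support:   [{lower:.3f}, {upper:.3f}]")
```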

Preface
1. Introduction
2. Random matrix theory
3. Statistical inference in linear models
4. Kernel methods
5. Large neural networks
6. Large-dimensional convex optimization
7. Community detection on graphs
8. Universality and real data
Bibliography
Index

Subject Areas: Signal processing [UYS], Machine learning [UYQM], Data capture & analysis [UNC], Probability & statistics [PBT]
