Sale price £32.59 GBP (regular price £37.99 GBP)
Free UK Shipping

Freshly printed - allow 4 days' lead time

Optimization for Data Analysis

A concise text that presents and analyzes the fundamental techniques and methods in optimization that are useful in data science.

Stephen J. Wright (Author), Benjamin Recht (Author)

ISBN 9781316518984, Cambridge University Press

Hardback, published 21 April 2022

238 pages
23.5 x 15.6 x 1.6 cm, 0.45 kg

'This textbook is a much-needed exposition of optimization techniques, presented with conciseness and precision, with emphasis on topics most relevant for data science and machine learning applications. I imagine that this book will be immensely popular in university courses across the globe, and become a standard reference used by researchers in the area.' Amitabh Basu, Johns Hopkins University

Optimization techniques are at the core of data science, including data analysis and machine learning. An understanding of basic optimization techniques and their fundamental properties provides important grounding for students, researchers, and practitioners in these areas. This text covers the fundamentals of optimization algorithms in a compact, self-contained way, focusing on the techniques most relevant to data science. An introductory chapter demonstrates that many standard problems in data science can be formulated as optimization problems. Next, many fundamental methods in optimization are described and analyzed, including: gradient and accelerated gradient methods for unconstrained optimization of smooth (especially convex) functions; the stochastic gradient method, a workhorse algorithm in machine learning; the coordinate descent approach; several key algorithms for constrained optimization problems; algorithms for minimizing nonsmooth functions arising in data science; foundations of the analysis of nonsmooth functions and optimization duality; and the back-propagation approach, relevant to neural networks.
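
To give a flavour of the methods named above, here is a purely illustrative sketch (not drawn from the book) of fixed-step gradient descent, the simplest of the gradient methods for smooth convex functions. The quadratic objective, step size, and iteration count are hypothetical choices made for this sketch.

import numpy as np

def gradient_descent(grad, x0, alpha=0.1, iters=100):
    # Fixed-step gradient descent: x_{k+1} = x_k - alpha * grad(x_k).
    x = x0
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

# Minimize f(x) = 0.5 * ||A x - b||^2, whose gradient is A^T (A x - b).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
x_star = gradient_descent(lambda x: A.T @ (A @ x - b), x0=np.zeros(2))
print(x_star)  # converges toward the least-squares solution [0.5, 1.0]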

1. Introduction
2. Foundations of smooth optimization
3. Descent methods
4. Gradient methods using momentum
5. Stochastic gradient
6. Coordinate descent
7. First-order methods for constrained optimization
8. Nonsmooth functions and subgradients
9. Nonsmooth optimization methods
10. Duality and algorithms
11. Differentiation and adjoints

Subject Areas: Machine learning [UYQM], Data capture & analysis [UNC], Maths for engineers [TBJ], Linear programming [PBUH], Optimization [PBU], Data analysis: general [GPH]
