R**N
Boyd is a wonderful teacher
As you begin your journey into machine learning, understand that the algorithms you use stand on very firm ground, mostly in optimization. Most optimization problems can in general be thought of as finding solutions in some R^n. It turns out that for a certain class of optimization problems we can find a global optimum, and this is the class of convex problems. Optimization is used everywhere, and all of us have used it already: something as simple as solving f'(x) = 0 is familiar from school days. But the problem is twofold: recognizing that a function f is convex in the first place, and then solving f'(x) = 0 when f is not conducive to exact solutions.

This is a great first book for someone looking to enter the world of machine learning through optimization. It is another approach, apart from the statistical side (which is well covered in ESL by Hastie and Tibshirani). Several fundamental things in machine learning, like SVMs and gradient descent, are based on concepts learnt from optimization.

One of the strengths of this book is that it doesn't jump into the solution methods until the problem is well understood. Spending more time in the problem space than the solution space lets the reader know why and when to apply the solutions suggested. The algorithms form the third (and last) part of the book. In a book like Fletcher, Newton's method would appear in the first chapter after the introduction. This approach too aids the new entrant to the field.

I find this book a slightly easier read than the one by Bertsekas, which could be a good second book before you move on to other topics based on your interest. If you are interested in finding solutions in R^n for general (say non-convex) f, core optimization books like Luenberger or Fletcher are recommendable, especially for numerical optimization enthusiasts, but these are only appreciated after a first pass through the subject. Nocedal and Wright is a wonderful book for someone with prior exposure to optimization.

As always, I find the prints from Cambridge University Press extremely readable and a pleasure to hold. I would recommend buying this book even though you may find prints online.

The problems in this book are hard, and you may have to struggle a bit to solve them completely. This might affect your choice of whether to use this book as a textbook for convex optimization.

*Important*: Supplement the book with the highly recommended set of video lectures on convex optimization by the same author (Boyd), available online. His conversational tone and casual dropping of profound statements make the video lectures some of the best I have seen.

Prerequisites: To appreciate the book, you need to have understood linear algebra (at least at the level of Strang) as well as calculus (Joydeep Dutta is recommended for this).

Overall: a recommended book for the library of every machine learning enthusiast. Not a must-have, but almost there.
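To make the point about convexity and gradient-based solvers concrete, here is a minimal sketch of gradient descent on a convex quadratic (my own illustration, not code from the book; the matrix A, the right-hand side b, and the step size are arbitrary choices). Because the objective is convex, the iterates converge to the unique point where the gradient vanishes, which is the global minimum:

import numpy as np

# Minimal illustrative sketch (not from the book): gradient descent on the
# convex quadratic f(x) = 0.5 * x^T A x - b^T x with A symmetric positive
# definite, so the stationary point (gradient = 0) is the global minimum.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])       # symmetric positive definite => f is convex
b = np.array([1.0, -1.0])

def grad(x):
    # Gradient of f at x: A x - b
    return A @ x - b

x = np.zeros(2)                  # arbitrary starting point
step = 0.1                       # fixed step size, small enough for this A
for _ in range(200):
    x = x - step * grad(x)

print("gradient descent:", x)
print("exact solution:  ", np.linalg.solve(A, b))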
A**N
Excellent book. Great reference for management science and numerical engineering !!!
Classic book based on the iconic course taught at Stanford EE. Delivered in perfect condition. Thanks Amazon for the excellent service !!!
M**H
Very bad paper and printing quality
The paper quality is too bad. It should cost Rs 500 for this quality; I got it for Rs 2500 from Campus Book House.