This text is a concise account of the fundamental principles of optimization theory, blended judiciously with current research.
Using the powerful language of convex analysis, it helps the reader probe into such advanced topics as nonsmooth optimization and conjugate duality. The now-traditional area of differentiable optimization is also revisited, with many new insights drawn from very recent research in this area.
Table of Contents
• Introduction
  • A Little History
  • Definitions and Basic Facts
  • Conditions for a Minimum
• Elements of Convex Analysis
  • Convex Sets and Separation Theorems
  • Polyhedral Convex Sets and Farkas' Lemma
  • Convex Functions: Basic Properties and Generalizations
  • Subdifferentials and Calculus Rules
  • Tangent and Normal Cones
  • Theorems of the Alternative
• Karush-Kuhn-Tucker Conditions
  • Unconstrained Minimization
  • Fritz John Conditions
  • Karush-Kuhn-Tucker Conditions
  • Generalized Convexity and Sufficiency
  • Equality Constraints
• Convex Optimization
  • The Basic Problem
  • Convex Optimization with Inequality Constraints
  • Saddle Point Conditions
  • Convex Optimization with Mixed Constraints
• Nonsmooth Optimization
  • Clarke Subdifferential and Related Results
  • Clarke Tangent and Normal Cones
  • Optimality Conditions in Lipschitz Optimization
  • Applications to Strict Minimization
  • Generalized Convexity and Nonsmoothness
  • Quasidifferentials and Optimality Conditions
  • Subdifferentials of Non-Lipschitz Functions: Some Ideas
• Duality
  • The Value Function and Lagrangian Duality
  • Fenchel Duality
  • Fractional Programming Duality
  • Nonlinear Lagrangian and Nonconvex Duality
• Monotone and Generalized Monotone Maps
  • Motivation
  • Convexity and Monotonicity
  • Subdifferential as a Monotone Map
  • Quasimonotone and Pseudomonotone Maps