This textbook offers a concise yet rigorous introduction to calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control.
Calculus of Variations and Optimal Control Theory also traces the historical development of the subject and features numerous exercises, notes and references at the end of each chapter, and suggestions for further study.
I have been using this as my primary reference for optimal control theory for a couple of years now. I like the fact that it builds up gently from first principles of calculus, as generally introduced in undergraduate calculus courses. It does not trivially dismiss proofs, and it clearly states all the assumptions involved, covering even the minute details often overlooked when presenting results. The chapter on the Maximum Principle is particularly detailed and has been quite helpful to me as a graduate student.
All chapters are logically connected. You will see that optimality in general function spaces is an extension of optimality in Euclidean-space calculus.