Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization. Chapters 1 and 2 focus on describing systems and evaluating their performance. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of chapters 4 and 5, and chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.
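To make the dynamic-programming theme concrete, here is a minimal sketch (my own toy example, not taken from the book): a finite-horizon linear-quadratic regulator for a scalar system `x[k+1] = a*x[k] + b*u[k]`, where Bellman's principle yields a backward Riccati recursion for the feedback gains.

```python
# Toy dynamic-programming illustration: scalar finite-horizon LQR.
# Minimize sum_k (q*x^2 + r*u^2) + qf*x_N^2 subject to x[k+1] = a*x[k] + b*u[k].
# All parameter values below are arbitrary choices for demonstration.

def lqr_gains(a, b, q, r, qf, horizon):
    """Backward pass: compute feedback gains K[0..horizon-1] via the Riccati recursion."""
    P = qf                                  # cost-to-go weight at the final stage
    gains = []
    for _ in range(horizon):
        K = (a * b * P) / (r + b * b * P)   # gain minimizing the stage cost-to-go
        P = q + a * a * P - a * b * P * K   # updated cost-to-go weight one stage earlier
        gains.append(K)
    gains.reverse()                         # gains[k] now applies at stage k
    return gains

def total_cost(a, b, q, r, qf, x0, controls):
    """Forward pass: roll out the system and accumulate the quadratic cost."""
    x, cost = x0, 0.0
    for u in controls:
        cost += q * x * x + r * u * u
        x = a * x + b * u
    return cost + qf * x * x

# Unstable open-loop system (a > 1), so doing nothing is expensive.
a, b, q, r, qf, x0, N = 1.2, 1.0, 1.0, 1.0, 1.0, 5.0, 20
gains = lqr_gains(a, b, q, r, qf, N)

# Closed loop: apply the state-feedback law u[k] = -K[k] * x[k].
x, us = x0, []
for K in gains:
    us.append(-K * x)
    x = a * x + b * us[-1]

closed = total_cost(a, b, q, r, qf, x0, us)
open_loop = total_cost(a, b, q, r, qf, x0, [0.0] * N)
print(closed, open_loop)    # the DP-derived policy beats the zero-control policy
```

The book develops this machinery in much greater generality (vector states, constraints, continuous time); the scalar case above just shows the backward-then-forward structure that dynamic programming imposes.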
This is a good book, but it does not explicitly build intuition; it reads more as a survey of the core ideas. It works better as a cookbook to revisit than as a source of long-lasting insights that transfer to other problem domains. Still, it is on my list to re-read later, and it is probably the best, most concise book for learning about optimal control problems.