Our course covers two domains within the field of control theory: optimal control and robust control. Both domains are so broad and deep that it is impossible to name a single book covering them both, all the more so because our preference in this course is to give a broad overview of the concepts, methods, and tools rather than to dig deep into a narrow segment.

Moreover, in our course we will need at least the basics of one discipline of applied (numerical) mathematics, namely (numerical) optimization. We have therefore given up on building this course around a single (text)book.

For each of these three domains there are dozens of high-quality textbooks on the market and online texts available for download. We will therefore only give reading assignments from online resources and recommend literature for optional further reading as we study the individual (weekly) topics.

Note, however, that for the last part of the course, in which we discuss robust control, we do have one confident recommendation (see [8] in the literature for robust control). Several copies are reserved for the students of this course in the university library. This is the only book that we strictly require in this course (and only for a part of it).

Below we give some general recommendations on literature, categorized into the three domains. Students are not required to obtain these books, but such commented lists may be of some use to interested students.

Numerical optimization

From this discipline we will only invoke the basic concepts and tools. We will not follow any single textbook offering a full course in numerical optimization, and therefore we do not rely on students having access to any particular book. Should interested students want recommendations for further study, below is a list of popular books on the topic. In particular, [6] offers lucid explanations and [7] is fairly comprehensive yet readable. The freely available [2] is now regarded as a "convex optimization bible". A few recently published textbooks are also legally available online and are fairly comprehensive and readable: [8] and [9], the latter including code in the Julia language.

[1]
D. P. Bertsekas, Nonlinear Programming, 3rd edition. Belmont, MA: Athena Scientific, 2016.
[2]
S. Boyd and L. Vandenberghe, Convex Optimization. Cambridge, UK; New York: Cambridge University Press, 2004. Available [online].
[3]
J. Brinkhuis and V. Tikhomirov, Optimization: Insights and Applications. Princeton, N.J: Princeton University Press, 2005.
[4]
R. Fletcher, Practical Methods of Optimization, 2nd edition. Chichester; New York: Wiley, 2000.
[5]
G. C. Calafiore and L. El Ghaoui, Optimization Models. Cambridge University Press, 2014. [online version]
[6]
D. G. Luenberger and Y. Ye, Linear and Nonlinear Programming, 3rd edition. New York, NY: Springer, 2008.
[7]
J. Nocedal and S. Wright, Numerical Optimization, 2nd edition. New York: Springer, 2006.
[8]
J. R. R. A. Martins and A. Ning, Engineering Design Optimization. Draft, February 2021. Downloadable at http://flowlab.groups.et.byu.net/mdobook.pdf.
[9]
M. J. Kochenderfer and T. A. Wheeler, Algorithms for Optimization. The MIT Press, 2019. Available: https://algorithmsbook.com/optimization/


Optimal control (and calculus of variations)

In the first half of our course we deal with the fundamentals of optimal control. For discrete-time systems we can invoke the concepts and tools of numerical optimization, but for continuous-time systems we need something else: calculus of variations. Hence our desired book should contain a fair discussion of calculus of variations. Our earlier choice used to be [7] (in its previous edition) in the list below; we then switched to [8], mainly because its draft is freely available on the author's webpage, while [7] is ridiculously expensive (though it has lately also been made available online on the author's web page). While [3] is regarded as a classic, [5] is much more readable. A few copies of [1] are available in the library. The recently published [6] really lives up to its title: it is very intuitive.

However, none of these books contains the right blend of topics for our course (with the rigour corresponding to our students' background). We will therefore not follow any single textbook, but we encourage students to get any of the books listed below; mapping the individual lectures to the chapters of a book will be easy because the material is by now standard.

[1]
B. D. O. Anderson and J. B. Moore, Optimal Control: Linear Quadratic Methods. Dover Publications, 2007.
[2]
M. Athans and P. L. Falb, Optimal Control: An Introduction to the Theory and Its Applications. Dover Publications, 2006.
[3]
A. E. Bryson Jr. and Y.-C. Ho, Applied Optimal Control: Optimization, Estimation and Control, Revised edition. CRC Press, 1975.
[4]
I. M. Gelfand and S. V. Fomin, Calculus of Variations. Mineola, N.Y: Dover Publications, 2000.
[5]
D. E. Kirk, Optimal Control Theory: An Introduction. Dover Publications, 2004.
[6]
M. Levi, Classical Mechanics with Calculus of Variations and Optimal Control: An Intuitive Introduction. Providence, RI: American Mathematical Society, 2014.
[7]
F. Lewis, D. Vrabie, and V. L. Syrmos, Optimal Control, 3rd edition. Hoboken: Wiley, 2012. [freely online on the author's web page]
[8]
D. Liberzon, Calculus of Variations and Optimal Control Theory: A Concise Introduction. Princeton University Press, 2011. Draft available [online].
[9]
C. R. MacCluer, Calculus of Variations: Mechanics, Control and Other Applications, Reprint edition. Dover Publications, 2012.


Robust control

Although somewhat narrow in scope, the topic of robust control is covered by a wealth of books. Nonetheless, here our choice is fairly confident. In the second half of the course we are going to use [8] heavily. A few copies of this book are available in the university library (reserved for the students of this course), but we strongly recommend purchasing the book; it may prove very useful as a reference even after you pass the exam in this course. Below we list some other relevant and popular books that might be useful for further study, but we are not going to work with them in this course.

[1]
S. P. Bhattacharyya, H. Chapellat, and L. H. Keel, Robust Control: The Parametric Approach. Prentice Hall, 1995.
[2]
M. A. Dahleh and I. J. Diaz-Bobillo, Control of Uncertain Systems: A Linear Programming Approach, 1st ed. Prentice Hall, 1995.
[3]
G. E. Dullerud and F. Paganini, A Course in Robust Control Theory: A Convex Approach. Springer, 2010.
[4]
B. A. Francis, A Course in H∞ Control Theory. Springer, 1987.
[5]
M. Green and D. J. N. Limebeer, Linear Robust Control, Reprint edition. Mineola, NY: Dover Publications, 2012.
[6]
J. W. Helton and O. Merino, Classical Control Using H-Infinity Methods: Theory, Optimization and Design. Society for Industrial and Applied Mathematics, 1998.
[7]
R. S. Sánchez-Peña and M. Sznaier, Robust Systems Theory and Applications, 1st ed. Wiley-Interscience, 1998.
[8]
S. Skogestad and I. Postlethwaite, Multivariable Feedback Control: Analysis and Design, 2nd ed. Wiley, 2005. The first three chapters are freely available [online].
[9]
K. Zhou, J. C. Doyle, and K. Glover, Robust and Optimal Control, 1st ed. Prentice Hall, 1995.
[10]
K. Zhou and J. C. Doyle, Essentials of Robust Control, 1st ed. Prentice Hall, 1997.