
Optimal Control

MAE 546

1252
This course covers the main principles of optimal control theory applied to deterministic continuous-time problems and provides guidance on numerical methods for their solution. Fundamental results are developed starting from parameter optimization and the calculus of variations, leading to Pontryagin's principle, dynamic programming, and the Hamilton-Jacobi-Bellman equation. Geometric and analytic properties of the formulations and solutions are highlighted. Numerical methods for direct and indirect optimal control problems are covered with applications. Emphasis is placed on building intuition that connects the various aspects of the course.
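For orientation only, a sketch of the Hamilton-Jacobi-Bellman equation named above, in its standard deterministic continuous-time form; the notation (x, u, L, \phi, f, V) is illustrative and not taken from the course materials:

\[
\min_{u(\cdot)} \; J = \phi\big(x(T)\big) + \int_{0}^{T} L\big(x(t), u(t), t\big)\, dt
\quad \text{subject to} \quad \dot{x} = f(x, u, t),
\]
\[
-\frac{\partial V}{\partial t}(x, t) = \min_{u} \left[ L(x, u, t) + \frac{\partial V}{\partial x}(x, t)^{\top} f(x, u, t) \right],
\qquad V(x, T) = \phi(x).
\]

Here V(x, t) is the optimal cost-to-go from state x at time t, and the minimizing u yields the optimal feedback control, which is the dynamic-programming viewpoint the course builds toward.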
Sections

Section L01