This book provides an elementary but rigorous presentation of optimal control theory. In the first chapter, a brief history and the foundations of optimal control theory are presented. The classical variational methods are presented in the second chapter. The minimum principle and its applications are the subject matter of the third chapter. The fourth chapter presents the dynamic programming method. In the final chapter, the solutions of the basic form of the linear-quadratic problem with finite and infinite horizon are presented.

This book can be used as a textbook or reference book for undergraduate students of applied mathematics and for control engineers. It can also serve as a reference for scholars and engineers in the field of optimal control theory.