Adaptive Critic Control with Robust Stabilization for Uncertain Nonlinear Systems
Uncertainty and nonlinearity arise in all walks of life. Every living organism
in nature interacts with its environment and improves its own actions in order to
survive and thrive. Because resources are limited, most organisms
act in an optimal fashion, conserving resources while still achieving their goals.
Hence, obtaining optimal actions that minimize consumption or maximize reward,
i.e., the idea of optimization, is both necessary and significant. In general, the optimal
control of nonlinear systems requires solving the nonlinear Hamilton–Jacobi–
Bellman (HJB) equation, which, unlike the Riccati equation arising in linear optimal
control, generally admits no analytical solution.
Consequently, nonlinear optimal control design in the presence of dynamic uncertainties
remains a difficult and challenging area.
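As a brief illustration of this difficulty (standard background material, not notation specific to this book): for a continuous-time affine nonlinear system with dynamics and an infinite-horizon quadratic-in-control cost, the optimal value function satisfies a nonlinear partial differential equation rather than a matrix equation:

```latex
% Assumed standard setup: dynamics \dot{x} = f(x) + g(x)u,
% cost J(x_0) = \int_0^\infty \big( Q(x) + u^\top R u \big)\,\mathrm{d}t.
% The optimal value function V^* satisfies the HJB equation
0 = \min_{u} \Big[ Q(x) + u^\top R u
      + \big(\nabla V^*(x)\big)^\top \big( f(x) + g(x)u \big) \Big],
% whose minimizing control is
u^*(x) = -\tfrac{1}{2} R^{-1} g(x)^\top \nabla V^*(x).
```

For linear dynamics with quadratic cost this reduces to the algebraic Riccati equation, which can be solved exactly; in the general nonlinear case the HJB equation admits no closed-form solution, which is what motivates adaptive critic (approximate dynamic programming) methods.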
This book reports on the latest advances in adaptive critic control with robust stabilization
for uncertain nonlinear systems.
Covering the core theory, novel methods, and a number of typical industrial applications
related to the robust adaptive critic control field, it develops a comprehensive framework
of robust adaptive strategies, including theoretical analysis, algorithm design, simulation
verification, and experimental results. As such, it is of interest to university researchers,
graduate students, and engineers in the fields of automation, computer science, and
electrical engineering wishing to learn about the fundamental principles, methods, algorithms,
and applications in the field of robust adaptive critic control. In addition, it promotes the
development of robust adaptive critic control approaches and the construction of higher-level intelligent systems.