
Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions (1st edition)


Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved.

An optimal control is a set of differential equations describing the paths of the control variables that minimize the cost functional. This book, Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions, considers different classes of systems with quadratic performance criteria.

It then attempts to find the optimal control law for each class of systems using orthogonal functions that can optimize the given performance criteria.
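For orientation, the quadratic performance criterion mentioned above is usually stated in the standard linear-quadratic form below; this is a generic formulation, not necessarily the notation used in the book.

\[
\dot{x}(t) = A\,x(t) + B\,u(t), \quad x(0) = x_0, \qquad
J = \tfrac{1}{2}\,x^{\mathsf{T}}(t_f)\,S\,x(t_f)
  + \tfrac{1}{2}\int_{0}^{t_f} \bigl( x^{\mathsf{T}}(t)\,Q\,x(t) + u^{\mathsf{T}}(t)\,R\,u(t) \bigr)\,dt,
\]

with Q positive semidefinite and R positive definite. In the orthogonal-function approach, x(t) and u(t) are expanded in a truncated series of basis functions (such as block-pulse functions or shifted Legendre polynomials), and an operational matrix of integration converts the dynamic optimization into a set of algebraic equations in the expansion coefficients.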

Illustrated throughout with detailed examples, the book covers topics including:
- Block-pulse functions and shifted Legendre polynomials
- State estimation of linear time-invariant systems
- Linear optimal control systems incorporating observers
- Optimal control of systems described by integro-differential equations
- Linear-quadratic-Gaussian control
- Optimal control of singular systems
- Optimal control of time-delay systems with and without reverse time terms
- Optimal control of second-order nonlinear systems
- Hierarchical control of linear time-invariant and time-varying systems
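To make the first topic above concrete, here is a minimal sketch (not taken from the book) of the block-pulse machinery such methods build on: a signal is represented by its mean values on m equal sub-intervals of [0, T], and integration of the series reduces to multiplying the coefficient vector by an operational matrix. All names (bpf_coeffs, bpf_integration_matrix, m, T) are illustrative.

```python
# Minimal sketch, assuming nothing about the book's own notation or code:
# approximate a signal by block-pulse functions and integrate it with the
# block-pulse operational matrix of integration.
import numpy as np

def bpf_coeffs(f, m, T=1.0):
    """Block-pulse coefficients: mean value of f on each of m sub-intervals."""
    h = T / m
    return np.array([np.mean(f(np.linspace(i * h, (i + 1) * h, 201)))
                     for i in range(m)])

def bpf_integration_matrix(m, T=1.0):
    """Operational matrix of integration: integrating the i-th block pulse
    contributes h/2 on its own sub-interval and h on every later one."""
    h = T / m
    return (h / 2.0) * np.eye(m) + h * np.triu(np.ones((m, m)), k=1)

m, T = 32, 1.0
f = lambda t: np.cos(2.0 * np.pi * t)
c = bpf_coeffs(f, m, T)                   # coefficients of f
c_int = c @ bpf_integration_matrix(m, T)  # coefficients of the integral of f
mid = (np.arange(m) + 0.5) * (T / m)      # sub-interval midpoints
exact = np.sin(2.0 * np.pi * mid) / (2.0 * np.pi)
print("max error of integrated series:", np.max(np.abs(c_int - exact)))
```

The same operational-matrix idea is what allows the differential state and costate equations to be replaced by algebraic equations in the expansion coefficients; shifted Legendre polynomials play the analogous role with a different, but equally explicit, integration matrix.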

Special order line: only available to educational & business accounts.
£215.00
Product Details
Publisher: CRC Press
ISBN: 1351832239 / 9781351832236
Format: eBook (EPUB)
Dewey code: 515.55
Published: 08/10/2018
Language: English
Extent: 247 pages
Copy/print allowance: copy 30%; print 30%