Optimal Control
Optimal control is a branch of mathematical optimization concerned with finding the control strategy that minimizes (or maximizes) a defined objective function over a given time period, subject to the dynamics of the system and any constraints on its states and inputs.
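A classic instance of this setup is the discrete-time linear-quadratic regulator (LQR): linear dynamics, a quadratic cost, and a finite horizon. The sketch below, a hypothetical illustration for the scalar case, computes the optimal feedback gains by the standard backward Riccati recursion; all variable names and parameter values are chosen for illustration only.

```python
def lqr_scalar(a, b, q, r, qf, N):
    """Optimal gains for x[k+1] = a*x[k] + b*u[k] with cost
    sum_k (q*x[k]^2 + r*u[k]^2) + qf*x[N]^2, via backward Riccati recursion."""
    P = qf          # cost-to-go weight at the final step
    gains = []
    for _ in range(N):
        # Gain at step k uses P_{k+1}; optimal input is u[k] = -K*x[k].
        K = (a * b * P) / (r + b * b * P)
        # Riccati update: propagate the cost-to-go one step backward.
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
        gains.append(K)
    gains.reverse()  # gains[k] is the gain applied at time step k
    return gains

def simulate(a, b, gains, x0):
    """Roll the closed-loop system forward from x0 under the computed gains."""
    x, traj = x0, [x0]
    for K in gains:
        x = a * x + b * (-K * x)
        traj.append(x)
    return traj
```

For example, with unstable open-loop dynamics (a = 1.1), the optimal feedback drives the state toward zero while trading off control effort against state deviation through the weights q and r.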
Similar Concepts
- control and feedback
- control system optimization
- discrete-time control
- dynamics and control
- fuzzy control
- lyapunov stability in optimal control
- multi-variable control
- multivariable control
- optimal control of chaotic systems
- optimization
- robotic control
- robust control
- state-space control
- stochastic control
- stochastic control theory