Control Theory Overview

Jul 28, 2024

Control Theory and Autonomous Systems

Introduction

  • Goal: Understand how to control autonomous systems (e.g., cars, buildings, distillation columns).
  • Control Theory: A mathematical framework for developing these systems.

Basic Concepts

  • Dynamical System: The system we want to control.
    • Can be anything (car, building, distillation column).
    • Affected by:
      • Control Inputs (U): Intentional actions (e.g., steering, braking).
      • Disturbances (D): Unintentional effects (e.g., wind, bumps).
  • State (X): Changes over time based on the control inputs, disturbances, and the system dynamics (see the sketch after this list).
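
To make these terms concrete, here is a minimal sketch (not from the talk) that steps a toy discrete-time model x[k+1] = A x[k] + B u[k] + d[k] for a 1-D vehicle; the matrices, time step, and noise level are illustrative assumptions.

    import numpy as np

    # Toy 1-D vehicle: state X = [position, velocity], control input U = commanded
    # acceleration, disturbance D = random bumps/wind. All values are assumptions.
    dt = 0.1                                  # time step [s]
    A = np.array([[1.0, dt],
                  [0.0, 1.0]])                # dynamics: how the state carries forward
    B = np.array([0.5 * dt**2, dt])           # how the control input enters the state

    rng = np.random.default_rng(0)
    x = np.array([0.0, 0.0])                  # initial state: at rest at the origin

    for k in range(50):
        u = 1.0                               # control input: constant acceleration
        d = rng.normal(0.0, 0.02, size=2)     # disturbance: small random effects
        x = A @ x + B * u + d                 # state changes with inputs and dynamics

    print("state after 5 s [position, velocity]:", x)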

Open Loop Control

  • Feedforward Control: The algorithm generates control signals based only on the desired outcome (reference R), without measuring the current state.
    • Example: Steering straight and maintaining speed.
    • Limitations: Requires thorough knowledge of the system dynamics and a predictable environment (see the sketch after this list).
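
A minimal open-loop sketch (illustrative assumptions, not from the talk): the input is computed once from the reference and an assumed speed model, so when an unmeasured disturbance appears the error is never corrected.

    # Open-loop (feedforward) cruise control on an assumed first-order speed model
    # v' = -a*v + b*u. Parameters and the disturbance are illustrative assumptions.
    a, b = 0.1, 0.5
    v_ref = 20.0               # reference R: desired speed [m/s]
    u_ff = a * v_ref / b       # input that holds v_ref only if the model is exact

    dt, v = 0.1, 0.0
    for k in range(600):
        disturbance = -0.5 if k > 300 else 0.0       # a head wind appears mid-run
        v += dt * (-a * v + b * u_ff + disturbance)  # the state is never measured

    print(f"final speed: {v:.1f} m/s, reference: {v_ref} m/s")  # error persists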

Feedback Control

  • Closed Loop Control: Uses both reference and current state to adjust control inputs.
    • Self-correcting: If the state deviates from the reference, the controller detects the error and adjusts the inputs.
    • Understanding the system dynamics is important, because feedback can change the stability of the system.
    • Types of feedback controllers:
      • Linear Controllers: PID, Full State Feedback (see the PID sketch after this list).
      • Non-linear Controllers: On-off, Sliding Mode, Gain Scheduling.
      • Robust Controllers: handle uncertainty.
      • Adaptive Controllers: adjust to changes in the system.
      • Optimal Controllers: minimize cost functions.
      • Predictive Controllers: forecast future states to optimize control inputs.
      • Intelligent Controllers: Rely on data and learned or heuristic rules (e.g., fuzzy logic, reinforcement learning).
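
Since PID is the most common example above, here is a minimal discrete-time PID loop on the same assumed first-order speed model used earlier; the gains are hand-picked for illustration only.

    # Closed-loop PID control on the assumed first-order speed model v' = -a*v + b*u.
    # Gains Kp, Ki, Kd are illustrative, not tuned values from the talk.
    a, b = 0.1, 0.5
    Kp, Ki, Kd = 2.0, 0.5, 0.1
    dt, v, v_ref = 0.1, 0.0, 20.0
    integral, prev_error = 0.0, 0.0

    for k in range(600):
        error = v_ref - v                        # measure the state, compare to R
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = Kp * error + Ki * integral + Kd * derivative
        prev_error = error

        disturbance = -0.5 if k > 300 else 0.0    # same head wind as the open-loop case
        v += dt * (-a * v + b * u + disturbance)  # feedback corrects the deviation

    print(f"final speed: {v:.2f} m/s, reference: {v_ref} m/s")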

Planning

  • Essential step in designing control systems.
    • Determines the path to the destination, avoids obstacles, adheres to traffic rules, and considers comfort and safety.
    • Examples of planning algorithms: Rapidly-exploring random trees (RRT), A* (see the sketch after this list).
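
As a small illustration, here is a compact A* sketch on a hypothetical 4-connected grid with a Manhattan-distance heuristic; the grid, costs, and obstacles are assumptions, and a real planner would also weigh comfort, rules, and vehicle dynamics.

    import heapq, itertools

    def astar(grid, start, goal):
        """A* on a 4-connected grid; cells with 1 are obstacles (illustrative)."""
        rows, cols = len(grid), len(grid[0])
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
        tie = itertools.count()                                   # heap tie-breaker
        frontier = [(h(start), next(tie), 0, start, None)]
        came_from, g_cost = {}, {start: 0}
        while frontier:
            _, _, g, node, parent = heapq.heappop(frontier)
            if node in came_from:
                continue                      # already expanded with a better cost
            came_from[node] = parent
            if node == goal:                  # walk back through parents to get the path
                path = [node]
                while came_from[path[-1]] is not None:
                    path.append(came_from[path[-1]])
                return path[::-1]
            r, c = node
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                    if g + 1 < g_cost.get(nxt, float("inf")):
                        g_cost[nxt] = g + 1
                        heapq.heappush(frontier, (g + 1 + h(nxt), next(tie), g + 1, nxt, node))
        return None                           # no obstacle-free path exists

    grid = [[0, 0, 0, 0],        # 0 = free cell, 1 = obstacle (hypothetical map)
            [1, 1, 0, 1],
            [0, 0, 0, 0],
            [0, 1, 1, 0]]
    print(astar(grid, (0, 0), (3, 3)))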

State Measurement and Observability

  • Measurement Noise: Sensors are noisy, so raw measurements do not give an exact picture of the state.
  • Observability: Sensors must be chosen and placed so that every state can be inferred from the measurements (e.g., deriving acceleration from a speedometer).
  • State estimation techniques: Kalman filter, particle filter, moving averages (see the sketch after this list).
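
As a sketch of state estimation under measurement noise, here is a small Kalman filter that tracks position and velocity from noisy position measurements alone; the model matrices and noise levels are illustrative assumptions.

    import numpy as np

    # Kalman filter sketch: estimate [position, velocity] when only a noisy
    # position measurement is available. Matrices and noise levels are assumptions.
    dt = 0.1
    A = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
    H = np.array([[1.0, 0.0]])                # we only measure position
    Q = 0.01 * np.eye(2)                      # process noise covariance (assumed)
    R = np.array([[0.5]])                     # measurement noise covariance (assumed)

    rng = np.random.default_rng(1)
    x_true = np.array([0.0, 1.0])             # true state: starts at 0, moves at 1 m/s
    x_est = np.zeros(2)                       # initial estimate
    P = np.eye(2)                             # initial estimate covariance

    for _ in range(100):
        x_true = A @ x_true                                  # true motion
        z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]))   # noisy measurement

        # Predict: propagate the estimate through the model
        x_est = A @ x_est
        P = A @ P @ A.T + Q
        # Update: correct the prediction with the measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_est = x_est + K @ (z - H @ x_est)
        P = (np.eye(2) - K @ H) @ P

    print("true state:", x_true, "estimated state:", x_est)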

System Analysis

  • Ensures system meets requirements through:
    • Stability checking: Bode plots, Nyquist diagrams (see the sketch after this list).
    • Simulations: Matlab, Simulink.
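
Outside Matlab/Simulink, the same kind of checks can be scripted; the sketch below checks stability from the eigenvalues of an assumed state matrix and pulls Bode data for the matching transfer function (both are illustrative examples, not systems from the talk).

    import numpy as np
    from scipy import signal

    # 1) State-space check: a continuous-time system is stable if every eigenvalue
    #    of A has a negative real part. The matrix below is an assumed example.
    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])
    eigvals = np.linalg.eigvals(A)
    print("eigenvalues:", eigvals, "-> stable:", bool(np.all(eigvals.real < 0)))

    # 2) Frequency-domain check: Bode magnitude/phase data for the assumed transfer
    #    function G(s) = 1 / (s^2 + 3s + 2), the same dynamics as the A matrix above.
    sys = signal.TransferFunction([1.0], [1.0, 3.0, 2.0])
    w, mag, phase = signal.bode(sys)
    print("gain at the lowest plotted frequency: %.1f dB" % mag[0])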

Conclusion

  • All parts of control theory interlinked: Controller design, state estimation, planning, analysis.
  • Models: Crucial in all aspects of control theory.
  • Additional resources available for further learning on each topic mentioned.

Call to Action

  • Subscribe for more tech talks and control system lectures.