Stability of linear systems in state space and in the input-output sense. The Routh test for the stability of a polynomial. The concept of feedback and its importance. The standard feedback connection of two linear systems, with an algebraic stability test. Bode and Nyquist plots, the Nyquist theorem. Steady-state and transient response of a stable system, the behavior of second-order systems. PI and PD controllers, examples with a DC motor, anti-windup, Ziegler-Nichols tuning rules. Root-locus plots. The concepts of gain margin and phase margin. Design of controllers using the concepts of sensitivity, loop gain, steady-state tracking error, bandwidth, and disturbance attenuation. Lead-lag compensators, the use of Nichols charts in controller design. The control of non-minimum-phase systems. Analysis of feedback systems in the state space. The concepts of controllability, observability, stabilizability, and detectability, with the Kalman and Hautus tests for checking whether these properties hold. Stabilization by state feedback, pole placement. Some elements of linear-quadratic optimal control (Riccati equations, synthesis of an optimal state feedback). Observers and dynamic feedback using observers. The separation principle for designing a controller based on an observer.
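As a small illustration of the Routh test mentioned above, here is a sketch in Python. The function name is mine, and the code handles only the regular case (no zero in the first column of the Routh array); it also assumes a positive leading coefficient.

```python
def routh_hurwitz_stable(coeffs):
    """Return True if all roots of the polynomial (coefficients given
    highest degree first, leading coefficient positive) lie in the open
    left half-plane.  Regular case of the Routh test only: a zero in
    the first column is reported as 'not stable'."""
    c = [float(x) for x in coeffs]
    n = len(c) - 1
    rows = [c[0::2], c[1::2]]
    width = len(rows[0])
    rows[1] += [0.0] * (width - len(rows[1]))  # pad to equal length
    for i in range(2, n + 1):
        prev, prev2 = rows[i - 1], rows[i - 2]
        if prev[0] == 0.0:
            return False  # singular case: polynomial is not strictly Hurwitz
        # standard Routh recursion on 2x2 "determinants" over the pivot
        new = [(prev[0] * prev2[j + 1] - prev2[0] * prev[j + 1]) / prev[0]
               for j in range(width - 1)]
        new.append(0.0)
        rows.append(new)
    # stable iff the whole first column is positive (no sign changes)
    return all(r[0] > 0 for r in rows[: n + 1])
```

For example, `routh_hurwitz_stable([1, 3, 3, 1])` (the polynomial (s+1)^3) returns True, while `routh_hurwitz_stable([1, 1, 1, 2])` returns False because a sign change appears in the first column.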
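The PI control and anti-windup topics can be sketched together on a first-order DC-motor speed model. All numbers below (motor constants, gains, saturation limit, the back-calculation time constant) are illustrative choices of mine, not part of the course description; the anti-windup scheme shown is back-calculation, one of several standard variants.

```python
def simulate_pi_dc_motor(r=1.0, t_end=5.0, dt=1e-3):
    """PI speed control of a first-order DC-motor model,
    tau * dw/dt = -w + K_m * u, with actuator saturation and
    back-calculation anti-windup.  Returns the final speed."""
    tau, K_m, u_max = 0.5, 2.0, 1.0   # motor time constant, gain, input limit
    Kp, Ki, Tt = 2.0, 5.0, 0.1        # PI gains, anti-windup time constant
    w, integ = 0.0, 0.0               # motor speed, integrator state
    for _ in range(int(t_end / dt)):
        e = r - w
        u_raw = Kp * e + integ                # unsaturated PI output
        u = max(-u_max, min(u_max, u_raw))    # actuator saturation
        # back-calculation: bleed the integrator while saturated,
        # so it does not wind up beyond what the actuator can deliver
        integ += dt * (Ki * e + (u - u_raw) / Tt)
        w += dt * (-w + K_m * u) / tau        # forward-Euler motor step
    return w
```

With these gains the closed loop is stable, the saturation is active only during the initial transient, and the integral action drives the steady-state tracking error to zero, so the returned speed is close to the reference.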
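For the pole-placement topic, a compact single-input sketch is Ackermann's formula, which also exercises the controllability test: the formula only applies when the controllability matrix has full rank. The function name is mine.

```python
import numpy as np

def acker(A, B, poles):
    """Ackermann's formula for single-input pole placement (a sketch).
    Returns the row vector K such that the eigenvalues of A - B K are
    the desired poles, assuming (A, B) is controllable."""
    n = A.shape[0]
    # controllability matrix [B, AB, ..., A^(n-1) B] and the Kalman rank test
    C = np.hstack([np.linalg.matrix_power(A, i) @ B for i in range(n)])
    if np.linalg.matrix_rank(C) < n:
        raise ValueError("(A, B) is not controllable")
    # desired characteristic polynomial, highest degree first
    alpha = np.poly(poles)
    # evaluate that polynomial at the matrix A
    phi = sum(a * np.linalg.matrix_power(A, len(alpha) - 1 - i)
              for i, a in enumerate(alpha))
    e_n = np.zeros((1, n))
    e_n[0, -1] = 1.0                      # last row of the identity
    return e_n @ np.linalg.inv(C) @ phi
```

For the double integrator A = [[0, 1], [0, 0]], B = [[0], [1]] and desired poles {-1, -2}, this yields K = [2, 3], since A - B K then has characteristic polynomial s^2 + 3s + 2.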