Does linear discrete-time controllability imply stabilizability

Does linear discrete-time controllability imply stabilizability? I feel like it should: controllability is the ability to steer the state from any $x(0)$ to any desired state in finite time, while stabilizability is the ability to steer the system to zero with a control input defined over an infinite horizon. I would say that if a system $x(k+1) = Ax(k) + bu(k)$ is controllable, you can take the target state equal to $0$ and you have stabilizability. However, when I google this I cannot find a clear answer, and I'm questioning myself.
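The steering argument in the question can be made concrete. A minimal sketch (with illustrative matrices, not from the question): for a controllable $n$-state system, $x(n) = A^n x(0) + \mathcal{C}\,[u(n-1)\;\cdots\;u(0)]^T$, where $\mathcal{C} = [b\;\;Ab\;\cdots\;A^{n-1}b]$ is the controllability matrix, so an input sequence reaching $x(n) = 0$ exists whenever $\mathcal{C}$ is invertible:

```python
import numpy as np

# Hypothetical controllable 2-state system (illustrative values)
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])   # unstable: eigenvalue at 2
b = np.array([[0.0],
              [1.0]])
n = A.shape[0]
x0 = np.array([[3.0], [-1.0]])

# Controllability matrix C = [b, Ab, ..., A^{n-1} b]
C = np.hstack([np.linalg.matrix_power(A, k) @ b for k in range(n)])
assert np.linalg.matrix_rank(C) == n  # controllable

# x(n) = A^n x0 + C [u(n-1); ...; u(0)]; solve for inputs giving x(n) = 0
u = np.linalg.solve(C, -np.linalg.matrix_power(A, n) @ x0)

# Simulate: u is ordered [u(n-1); ...; u(0)], so apply u[n-1-k] at step k
x = x0
for k in range(n):
    x = A @ x + b * u[n - 1 - k]
print(x)  # [[0.], [0.]] -- driven to the origin in n steps
```

This is an open-loop input sequence, though, which is exactly the distinction the answer below draws.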

1 Answer

Conceptually, controllability and stabilizability are different things. Controllability is an open-loop property of the system: it characterizes the extent to which the control input can affect the state response (see also my answer here). Stabilizability is the existence of a static state-feedback controller that stabilizes the system.

It turns out that for LTI systems, if the system is controllable then it is also stabilizable. But simply taking the target state to be $0$ is not the way to prove this statement: it would only show the existence of a stabilizing control signal, not of a feedback controller. The actual proof is more involved and not intuitive or obvious (to me at least).
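For the controllable LTI case, one concrete way to exhibit the feedback controller is deadbeat pole placement: place all closed-loop eigenvalues at $0$, so that $A - bK$ is nilpotent and the state reaches zero in at most $n$ steps under feedback $u(k) = -Kx(k)$. A minimal sketch using Ackermann's formula (the system matrices are illustrative, not from the post):

```python
import numpy as np

# Hypothetical controllable 2-state system (illustrative values)
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])   # unstable: eigenvalue at 2
b = np.array([[0.0],
              [1.0]])
n = A.shape[0]

# Controllability matrix C = [b, Ab, ..., A^{n-1} b]
C = np.hstack([np.linalg.matrix_power(A, k) @ b for k in range(n)])
assert np.linalg.matrix_rank(C) == n  # controllability makes C invertible

# Ackermann's formula for the deadbeat gain: desired characteristic
# polynomial p(z) = z^n, so K = e_n^T C^{-1} p(A) = e_n^T C^{-1} A^n
e_n = np.zeros((1, n))
e_n[0, -1] = 1.0
K = e_n @ np.linalg.inv(C) @ np.linalg.matrix_power(A, n)

# The closed loop A - bK is nilpotent: (A - bK)^n = 0, so the feedback
# u(k) = -K x(k) drives any initial state to zero within n steps
Acl = A - b @ K
print(np.linalg.matrix_power(Acl, n))  # zero matrix
```

This is a static state-feedback law, so it witnesses stabilizability directly, which is the part the bare "take the target state to be $0$" argument leaves out.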

Having said that, you can probably find some "definitions" out there such as: "stabilizability is defined as the stability of the uncontrollable modes". This is true for LTI systems but not in general. In my opinion it shouldn't be a definition, since it can actually be proven from more general concepts. Also, it doesn't give the intuition that a definition should give.
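For LTI systems, that characterization is usually checked with the PBH test: $(A, b)$ is stabilizable iff $\operatorname{rank}\,[zI - A \;\; b] = n$ for every eigenvalue $z$ of $A$ with $|z| \ge 1$ (the discrete-time stability boundary). A minimal sketch with an illustrative system whose unstable mode is uncontrollable, hence not stabilizable:

```python
import numpy as np

# Hypothetical system: the unstable mode (eigenvalue 2) is unactuated
A = np.array([[0.5, 0.0],
              [0.0, 2.0]])
b = np.array([[1.0],
              [0.0]])   # input only reaches the stable first mode
n = A.shape[0]

# PBH test: stabilizable iff rank [zI - A, b] = n for all |z| >= 1
for z in np.linalg.eigvals(A):
    if abs(z) >= 1:
        M = np.hstack([z * np.eye(n) - A, b])
        print(z, np.linalg.matrix_rank(M))  # z = 2 gives rank 1 < 2
```

Here the rank drops at the unstable eigenvalue, so no state feedback can stabilize the system, matching "stabilizability = stability of the uncontrollable modes" in the LTI case.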
