Abstract
This paper studies model order reduction of discrete-time linear time-delay systems over a limited frequency interval. First, a finite-frequency index is introduced to characterize the desired approximation performance over the pre-specified frequency interval. By exploiting finite-frequency analysis results for linear delay systems, sufficient conditions guaranteeing stability of the reduced-order model and optimizing the finite-frequency approximation error are derived with the aid of matrix inequality techniques. The finite-frequency model order reduction problem is then converted into an LMI-based optimization problem, which can be solved efficiently. Finally, a numerical example is given to illustrate the effectiveness of the proposed results.

Keywords: Model order reduction · Finite frequency · Discrete-time linear time-delay systems · Linear matrix inequality (LMI)
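As a minimal illustration of the class of systems treated above, the sketch below checks stability of a discrete-time linear time-delay system x(k+1) = A x(k) + Ad x(k-d) by lifting it to an equivalent delay-free system and inspecting the spectral radius. The matrices A, Ad and the delay d here are hypothetical example values, not taken from the paper, and this eigenvalue test stands in for (rather than reproduces) the paper's LMI-based conditions.

```python
import numpy as np

# Hypothetical example system (A, Ad, and delay d are illustrative
# choices, not from the paper): x(k+1) = A x(k) + Ad x(k-d)
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
Ad = np.array([[0.1, 0.0],
               [0.05, 0.1]])
d = 2                      # discrete delay, in time steps
n = A.shape[0]

# Lift to a delay-free system with augmented state
# z(k) = [x(k); x(k-1); ...; x(k-d)], so that z(k+1) = Abar z(k).
Abar = np.zeros((n * (d + 1), n * (d + 1)))
Abar[:n, :n] = A           # current state feeds through A
Abar[:n, n * d:] = Ad      # delayed state feeds through Ad
for i in range(d):         # shift registers storing past states
    Abar[n * (i + 1):n * (i + 2), n * i:n * (i + 1)] = np.eye(n)

# The delay system is asymptotically stable iff the spectral
# radius of the lifted matrix is strictly less than 1.
rho = max(abs(np.linalg.eigvals(Abar)))
print(f"spectral radius = {rho:.4f} -> "
      f"{'stable' if rho < 1 else 'unstable'}")
```

The same lifting also makes plain why finite-frequency reduction is attractive: the lifted state dimension n(d+1) grows with the delay, so approximating the system well only over the frequency band of interest can permit a much smaller reduced-order model.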