Sequentially decoupling estimators for Box-Jenkins model estimation
A two-stage method that works with both open- and closed-loop data
In this paper, Biqiang Mu introduces a novel estimation method for Box-Jenkins (BJ) models that is both consistent and asymptotically efficient under standard open‑loop and closed‑loop data conditions. The method consists of two stages: first, a Sequentially Decoupling (SD) estimator built from three sequential least squares (LS) estimators; second, a Gauss‑Newton (GN) refinement step. The SD estimator first fits a high‑order ARX model, then estimates the BJ model's dynamic part via an auxiliary output‑error (OE) model, and finally estimates the noise model using another auxiliary OE model. Consistency is established under standard regularity conditions by leveraging the consistency of the underlying LS estimators for ARX and OE models.
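The first SD stage, a high-order ARX fit by least squares, can be sketched as follows. This is a hedged toy illustration, not the paper's experiments: the particular BJ system, the ARX order, and the sample size are all assumptions for the example. Stages two and three would then refit the dynamic and noise parts via the auxiliary OE models using the signals estimated here.

```python
import numpy as np
from scipy.signal import lfilter

# Toy BJ system (assumed for illustration): y = (B/F) u + (C/D) e with
# B = 0.5 q^-1, F = 1 - 0.7 q^-1, C = 1 + 0.3 q^-1, D = 1 - 0.5 q^-1.
rng = np.random.default_rng(0)
N = 5000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
y = lfilter([0.0, 0.5], [1.0, -0.7], u) + lfilter([1.0, 0.3], [1.0, -0.5], e)

# Stage 1: high-order ARX model A_n(q) y = B_n(q) u + e, fitted by least
# squares. The ARX order n is taken large; in the theory it must grow
# (slowly) with the sample size N.
n = 20
Phi = np.array([np.concatenate([-y[t - n:t][::-1], u[t - n:t][::-1]])
                for t in range(n, N)])
theta = np.linalg.lstsq(Phi, y[n:], rcond=None)[0]
a_hat, b_hat = theta[:n], theta[n:]

# Estimated innovations e_hat(t) = A_n(q) y(t) - B_n(q) u(t); these are the
# ingredients the subsequent auxiliary OE least-squares fits build on.
A_n = np.concatenate([[1.0], a_hat])
B_n = np.concatenate([[0.0], b_hat])
e_hat = lfilter(A_n, [1.0], y) - lfilter(B_n, [1.0], u)
```

For this system the high-order ARX polynomials approximate A_n ≈ D/C and B_n ≈ DB/(CF), so the leading coefficients of `a_hat` and `b_hat` should come out near -0.8 and 0.5, and the variance of `e_hat` near that of the true innovations.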
A key theoretical result shows that a single Gauss‑Newton iteration starting from the SD estimator produces an estimator asymptotically equivalent to the prediction error method (PEM), provided the ARX model order satisfies a mild growth condition. In practice, this means near‑optimal estimates can be obtained without running an iterative optimizer to convergence. Simulation studies validate the method's performance, demonstrating that the SD estimator plus one GN step matches the efficiency of fully iterative PEM. This work provides a computationally attractive alternative to the weighted null‑space fitting method, and is especially valuable for control systems and signal processing applications where BJ models are widely used.
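The effect of one Gauss-Newton step from a consistent initializer can be illustrated on a small output-error example. This is a simplified sketch, not the paper's algorithm: the system, the starting point (standing in for an SD-type estimate), and the finite-difference Jacobian are assumptions made for the illustration.

```python
import numpy as np
from scipy.signal import lfilter

# Toy OE system (assumed): y(t) = b q^-1 / (1 - f q^-1) u(t) + e(t)
rng = np.random.default_rng(1)
N = 4000
u = rng.standard_normal(N)
e = 0.1 * rng.standard_normal(N)
b_true, f_true = 0.5, 0.7
y = lfilter([0.0, b_true], [1.0, -f_true], u) + e

def yhat(th):
    """One-step-ahead predictor of the OE model at parameters th = (b, f)."""
    b, f = th
    return lfilter([0.0, b], [1.0, -f], u)

# Rough but consistent starting point (stand-in for the SD estimate).
theta = np.array([0.4, 0.6])

# One Gauss-Newton step on the prediction-error criterion:
# theta <- theta + (Psi^T Psi)^{-1} Psi^T eps, with Psi the predictor
# gradient (here approximated by forward finite differences).
eps = y - yhat(theta)
h = 1e-6
Psi = np.column_stack([(yhat(theta + h * np.eye(2)[i]) - yhat(theta)) / h
                       for i in range(2)])
theta = theta + np.linalg.lstsq(Psi, eps, rcond=None)[0]
```

After the single step, `theta` should sit much closer to the true `(0.5, 0.7)` than the initializer, mirroring the paper's point that one GN iteration from a consistent estimate recovers near-PEM accuracy.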
- The SD estimator uses three sequential least squares steps: high-order ARX, auxiliary OE for dynamics, and auxiliary OE for noise.
- Consistency is proven under standard regularity conditions for both open- and closed-loop data.
- One Gauss-Newton iteration yields an estimator asymptotically equivalent to the prediction error method.
Why It Matters
Offers a practical, consistent, and asymptotically efficient alternative for Box-Jenkins system identification in control and signal processing.