Recursive Least Squares and Multi-innovation Stochastic Gradient Parameter Estimation Methods for Signal Modeling
Citations
Additional excerpts
...Obviously, if the filtering gain vector departs from the optimal gain vector 𝐿ₖ by 𝛿𝐿ₖ, the estimation error covariance matrix obtained from (15) deviates from the minimal 𝑃ₖ₊₁ to 𝑃ₖ₊₁ + 𝛿𝑃ₖ₊₁, where 𝛿𝑃ₖ₊₁ is a nonnegative definite matrix....
[...]
...Substituting (15) into (16) gives
$$
\begin{aligned}
\delta P_{k+1} &= [A + Bu_k - (L_k + \delta L_k)c]\,P_k\,[A + Bu_k - (L_k + \delta L_k)c]^T + R_w \\
&\quad + (L_k + \delta L_k)R_v(L_k + \delta L_k)^T - P_{k+1} \\
&= -\delta L_k\left(cP_kA^T - cP_kc^TL_k^T + cP_kB^Tu_k - R_vL_k^T\right) \\
&\quad - \left(cP_kA^T - cP_kc^TL_k^T + cP_kB^Tu_k - R_vL_k^T\right)^T\delta L_k^T \\
&\quad + \delta L_k\left(cP_kc^T + R_v\right)\delta L_k^T \\
&= W_k + W_k^T + \delta L_k\left(cP_kc^T + R_v\right)\delta L_k^T,
\end{aligned} \tag{17}
$$
where $W_k := -\delta L_k\left(cP_kA^T - cP_kc^TL_k^T + cP_kB^Tu_k - R_vL_k^T\right)$....
[...]
...From (15), we find that $L_k + \delta L_k$ and $P_{k+1} + \delta P_{k+1}$ satisfy
$$
P_{k+1} + \delta P_{k+1} = [A + Bu_k - (L_k + \delta L_k)c]\,P_k\,[A + Bu_k - (L_k + \delta L_k)c]^T + R_w + (L_k + \delta L_k)R_v(L_k + \delta L_k)^T, \tag{16}
$$...
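The perturbation identity in (17) can be checked numerically. The sketch below is illustrative only (the system matrices, noise covariances, and scalar input are placeholders, not from any cited paper): it builds the optimal gain for one covariance update of the closed-loop matrix $A + Bu_k - L_kc$, perturbs it by a small $\delta L_k$, and verifies that the resulting covariance deviation matches $W_k + W_k^T + \delta L_k(cP_kc^T + R_v)\delta L_k^T$ and is nonnegative definite.

```python
import numpy as np

# Hedged numerical sketch: all matrices below are random placeholders.
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
c = rng.standard_normal((1, n))          # row output vector
P = np.eye(n)                            # prior covariance P_k (symmetric)
Rw = 0.1 * np.eye(n)                     # process-noise covariance
Rv = np.array([[0.5]])                   # measurement-noise covariance
u = 0.7                                  # scalar input u_k

def riccati(L):
    """One covariance update P_{k+1} for filter gain L (n x 1), as in (15)/(16)."""
    M = A + B * u - L @ c
    return M @ P @ M.T + Rw + L @ Rv @ L.T

# Gain minimizing the trace of P_{k+1} (standard Kalman form)
M0 = A + B * u
L_opt = M0 @ P @ c.T @ np.linalg.inv(c @ P @ c.T + Rv)
dL = 0.05 * rng.standard_normal((n, 1))  # gain departure delta L_k

dP = riccati(L_opt + dL) - riccati(L_opt)             # delta P_{k+1}
W = -dL @ (c @ P @ M0.T - c @ P @ c.T @ L_opt.T - Rv @ L_opt.T)
rhs = W + W.T + dL @ (c @ P @ c.T + Rv) @ dL.T        # right-hand side of (17)

assert np.allclose(dP, rhs)
# dP is nonnegative definite: any departure from the optimal gain
# inflates the estimation error covariance.
assert np.all(np.linalg.eigvalsh(dP) >= -1e-10)
```

At the optimal gain the linear term $W_k$ vanishes, so the deviation reduces to the quadratic form $\delta L_k(cP_kc^T + R_v)\delta L_k^T$, which is why $\delta P_{k+1}$ is nonnegative definite.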
[...]
Cites background from "Recursive Least Squares and Multi-i..."
...If k = Le, terminate the recursive process and obtain the characteristic parameters from equation (24); otherwise, increase k by 1 and go to Step (2)....
[...]
...If k = Le, terminate the recursive process; otherwise, k := k + 1 and go to Step (2)....
[...]
...Increase k by 1 and go to Step (2)....
[...]