We might first attempt to minimize the closed-loop MSE prediction error

$$E[e_n^2] = E\left[\left(X_n - \sum_{i=1}^m a_i \hat{X}_{n-i}\right)^2\right]$$

but this is computationally intractable: for long sequences, each $\hat{X}_{n-i}$ depends on the previous $a_i$ in a highly nonlinear fashion.
We instead choose the $a_i$ to minimize the open-loop error

$$E\left[\left(X_n - \sum_{i=1}^m a_i X_{n-i}\right)^2\right]$$

We now state the problem formally.
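For intuition, here is a minimal sketch of open-loop error minimization for $m = 1$ by brute-force grid search. The AR(1) signal, seed, and coefficient $0.8$ are illustrative assumptions, not from the notes; only numpy is used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: AR(1) process X_t = 0.8 X_{t-1} + noise.
n = 5_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()

# Empirical open-loop error E[(X_n - a1 X_{n-1})^2] over a grid of a1.
grid = np.linspace(-1.0, 1.0, 2001)
errors = [np.mean((x[1:] - a1 * x[:-1]) ** 2) for a1 in grid]
best = grid[int(np.argmin(errors))]
print(best)  # should be close to 0.8, the true AR(1) coefficient
```

The error is a quadratic (convex) function of $a_1$, which is why the grid search finds a single clear minimum, and why the calculus approach below the problem statement yields a linear system.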
# Problem

Given RVs $X_{n-m}, X_{n-m+1}, \dots, X_{n-1}, X_n$ (all with finite variance), determine their predictor coefficients $a_i$ by minimizing the prediction error

$$\min E\left[\left(X_n - \sum_{i=1}^m a_i X_{n-i}\right)^2\right]$$

# Solution

Define

$$g(a_1, \dots, a_m) = E\left[\left(X_n - \sum_{i=1}^m a_i X_{n-i}\right)^2\right]$$

We must find

$$\underset{a_1, \dots, a_m}{\arg\min}\ g(a_1, \dots, a_m)$$

Expanding the square,

$$E\left[\left(X_n - \sum_{i=1}^m a_i X_{n-i}\right)^2\right] = E[X_n^2] + \sum_{i=1}^m \sum_{k=1}^m a_i a_k E[X_{n-i} X_{n-k}] - 2 \sum_{i=1}^m a_i E[X_n X_{n-i}]$$

we now differentiate with respect to $a_j$:

$$\frac{\partial g}{\partial a_j} = 2 \sum_{k=1}^m a_k E[X_{n-j} X_{n-k}] - 2 E[X_n X_{n-j}]$$

Setting $\frac{\partial g}{\partial a_j} = 0 \ \forall j$ gives $m$ linear equations (note this works because the distortion is convex in the $a_i$).

## System of equations

$$\sum_{k=1}^m a_k E[X_{n-j} X_{n-k}] = E[X_n X_{n-j}], \quad j = 1, \dots, m \tag{*}$$

Let $r_{ik} = E[X_{n-i} X_{n-k}]$ and

$$R = \begin{pmatrix} r_{11} & r_{12} & \dots & r_{1m} \\ r_{21} & r_{22} & \dots & r_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ r_{m1} & r_{m2} & \dots & r_{mm} \end{pmatrix}, \qquad v = \begin{pmatrix} r_{01} \\ r_{02} \\ \vdots \\ r_{0m} \end{pmatrix}$$

From $(*)$, the optimal $a = (a_1, \dots, a_m)^T$ must solve

$$Ra = v$$

A solution always exists, but it is unique only if $R$ is invertible, in which case

$$a = R^{-1} v$$

## Remark

The autocorrelation matrix $R$ is symmetric (i.e. $r_{ij} = r_{ji}$). It is also positive semidefinite (or definite), i.e. $x^T R x \geq 0 \ \forall x = (x_1, \dots, x_m)^T$.
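Under the additional assumption of stationarity (so $r_{jk}$ depends only on $|j - k|$ and $R$ is Toeplitz), the normal equations $Ra = v$ can be sketched numerically with numpy. The AR(2) signal, seed, and coefficients $(0.6, -0.2)$ below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stationary signal: AR(2) with coefficients (0.6, -0.2).
n, m = 10_000, 2
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.standard_normal()

# Sample autocorrelation r[k] ~ E[X_n X_{n-k}] (biased estimator).
r = np.array([x[k:] @ x[: n - k] / n for k in range(m + 1)])

# Stationarity: r_{jk} = r[|j - k|], so R is the symmetric Toeplitz
# autocorrelation matrix and v_j = r[j].
R = np.array([[r[abs(j - k)] for k in range(m)] for j in range(m)])
v = r[1 : m + 1]

a = np.linalg.solve(R, v)  # optimal predictor coefficients
print(a)  # should be close to (0.6, -0.2), the AR(2) coefficients
```

For the Toeplitz case, `scipy.linalg.solve_toeplitz` exploits the structure and is cheaper than a general solve; the symmetry and positive semidefiniteness noted in the Remark can be checked on `R` directly.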