
Disclaimer: This is a homework problem

This is Problem 2.10 from "Adaptive Filter Theory" by Haykin (2nd edition).

Problem

A discrete-time stochastic process $\{x(n)\}$ that is real-valued consists of an AR process $\{u(n)\}$ and additive white noise process $\{v_2(n)\}$. The AR component is described by the difference equation

\begin{equation} \tag 1 u(n) + \sum_{k=1}^{M}a_k u(n-k) = v_1(n) \end{equation}

where $\{a_k\}$ are the set of AR parameters and $\{v_1(n)\}$ is a white noise process that is independent of $\{v_2(n)\}$. Show that $\{x(n)\}$ is an ARMA process described by

\begin{equation} \tag 2 x(n) = -\sum_{k=1}^{M}{a_k x(n-k)} + \sum_{k=1}^{M}{b_k e(n-k)} + e(n) \end{equation}

where $\{e(n)\}$ is a white noise process. How are the MA parameters $\{b_k\}$ defined? How is the variance of $e(n)$ defined?

Attempted Solution

There may be a way to approach this in the time domain, but I think I'm making some headway with the following z-domain approach.

First, I write $x(n)$ in terms of the AR process and the additive white noise:

\begin{equation} \tag 3 x(n) = u(n) + v_2(n) \end{equation}
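To sanity-check this setup, here is a minimal simulation sketch of (1) and (3). The AR order, coefficients, and noise variances below are assumed example values, not part of the problem:

```python
import numpy as np

# Minimal sketch (assumed example values): an AR(2) process u(n) driven by
# white noise v1(n), plus an independent white noise v2(n), per (1) and (3).
rng = np.random.default_rng(0)
N = 10_000
a = np.array([-0.5, 0.25])      # assumed AR parameters a_1, a_2 (stable choice)
sigma1, sigma2 = 1.0, 0.5       # assumed std. devs of v1(n) and v2(n)

v1 = sigma1 * rng.standard_normal(N)
v2 = sigma2 * rng.standard_normal(N)

u = np.zeros(N)
for n in range(N):
    # u(n) = -sum_{k=1}^{M} a_k u(n-k) + v1(n), i.e. equation (1) rearranged
    for k, ak in enumerate(a, start=1):
        if n - k >= 0:
            u[n] -= ak * u[n - k]
    u[n] += v1[n]

x = u + v2                      # equation (3)
print(x[:5])
```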

Taking the Z-transform of (3), I get

\begin{equation} \tag 4 X(z) = U(z) + V_2(z) \end{equation}

where I can express $U(z)$ using (1) as

\begin{equation} \tag 5 U(z) = \frac{V_1(z)}{1+\sum_{k=1}^{M}{a_k z^{-k}}} = \frac{V_1(z)}{A(z)} \end{equation}

So (4) becomes

\begin{equation} \tag 6 X(z) = \frac{V_1(z)}{A(z)} + V_2(z) \end{equation}

For the ARMA process described by (2), I'm looking for the following input-output relation:

\begin{equation} \tag 7 \frac{X(z)}{E(z)} = \frac{B(z)}{A(z)} \end{equation}

where $B(z) = 1+\sum_{k=1}^{M}{b_k z^{-k}}$.

Equating the expressions for $X(z)$ in (6) and (7), I get the following equation:

\begin{equation} \tag 8 \frac{E(z)B(z)}{A(z)} = \frac{V_1(z)}{A(z)} + V_2(z) \end{equation}

Multiplying (8) through by $A(z)$ leaves

\begin{equation} \tag 9 B(z)E(z) = V_1(z) + A(z)V_2(z) \end{equation}

It seems that, in principle, I could use this relation to come up with values for $b_k$, $k=1,2,\dots,M$, but I don't know how to determine the variance of $e(n)$.

It's possible (likely) that I'm approaching this problem wrong. Does anyone have any insight into how one might go about solving it?

Edit: I think I've figured out the answer to this question, but it would be nice if somebody more qualified could verify my solution.

Continuing where I left off...

Taking the inverse Z-transform of $(9)$, I get

\begin{equation} \tag {10} \sum_{k=0}^{M}{b_k e(n-k)}=v_1(n)+\sum_{k=0}^{M}a_kv_2(n-k) \end{equation}

where $b_0 = a_0 = 1$.

Squaring both sides and taking the expectation (i.e., evaluating the autocorrelation at lag zero) gives

\begin{equation} \tag {11} \mathbb{E} \left[ \left(\sum_{k=0}^{M}{b_k e(n-k)} \right)^2 \right] = \mathbb{E} \left[ \left( v_1(n)+\sum_{k=0}^{M}a_kv_2(n-k) \right)^2 \right] \end{equation}

The cross terms vanish because all of the noise samples involved are mutually uncorrelated, so I get

\begin{equation} \tag {12} \sigma_e^2 \sum_{k=0}^{M}{b_k^2} = \sigma_1^2 + \sigma_2^2 \sum_{k=0}^{M}{a_k^2} \end{equation}

where $\sigma_e^2 = \mathbb{E}[e(n)^2]$ and $\sigma_k^2 = \mathbb{E}[v_k(n)^2]$, $k=1,2$.
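The right-hand side of $(12)$ can be checked by Monte Carlo simulation. The coefficients and variances below are assumed example values; the check is that the empirical variance of $v_1(n)+\sum_{k=0}^{M} a_k v_2(n-k)$ matches $\sigma_1^2 + \sigma_2^2 \sum_{k=0}^{M} a_k^2$:

```python
import numpy as np

# Monte Carlo check of the RHS of (12), assuming example parameter values.
# Since v1 and v2 are independent white processes,
# Var(v1(n) + sum_k a_k v2(n-k)) = sigma1^2 + sigma2^2 * sum_k a_k^2.
rng = np.random.default_rng(1)
N = 1_000_000
a = np.array([1.0, -0.5, 0.25])   # a_0 = 1 plus assumed a_1, a_2
sigma1, sigma2 = 1.0, 0.5

v1 = sigma1 * rng.standard_normal(N)
v2 = sigma2 * rng.standard_normal(N)

# sum_{k=0}^{M} a_k v2(n-k) is an FIR filter applied to v2
filtered = np.convolve(v2, a, mode="full")[:N]
rhs = v1 + filtered

empirical = rhs.var()
theoretical = sigma1**2 + sigma2**2 * np.sum(a**2)
print(empirical, theoretical)
```

With $10^6$ samples, the two values agree to within sampling error, which supports the cross-term cancellation used above.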

As long as the relation in $(12)$ holds, I should be good, right?

So I can choose to define $\sigma_e^2$ however I want, then define $b_k$, $k=1,\dots,M$, such that

\begin{equation} \tag {13} \sum_{k=0}^{M}{b_k^2} = \frac{\sigma_1^2 + \sigma_2^2 \sum_{k=0}^{M}{a_k^2}}{\sigma_e^2} \end{equation}

Alternatively, I could define $b_k$, $k=1,\dots,M$, however I want and compute $\sigma_e^2$ as

\begin{equation} \tag {14} \sigma_e^2 = \frac{\sigma_1^2 + \sigma_2^2 \sum_{k=0}^{M}{a_k^2}}{\sum_{k=0}^{M}{b_k^2}} \end{equation}
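As a worked numeric instance of $(14)$, with assumed example values for $a_k$, $b_k$, $\sigma_1^2$, and $\sigma_2^2$ (and $a_0 = b_0 = 1$):

```python
import numpy as np

# Worked instance of (14) under assumed example parameters; none of these
# specific numbers come from the problem itself.
a = np.array([1.0, -0.5, 0.25])   # a_0 = 1 plus assumed AR parameters
b = np.array([1.0, 0.3, -0.1])    # b_0 = 1 plus assumed MA parameters
sigma1_sq, sigma2_sq = 1.0, 0.25

sigma_e_sq = (sigma1_sq + sigma2_sq * np.sum(a**2)) / np.sum(b**2)
print(sigma_e_sq)
```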
