Suppose $\mathbf{Y}$ is an $n$-dimensional random vector, $A$ is a fixed $r \times n$ matrix, and $b$ is a fixed vector in $\mathbb{R}^r$ (so that $A\mathbf{Y}+b$ is defined). I have already proven that $$\mathbb{E}\left[A\mathbf{Y}+b\right] = A\mathbb{E}\left[\mathbf{Y}\right]+b\text{, }$$ $$\mathbb{E}\left[A\mathbf{Y}B\right] = A\mathbb{E}[\mathbf{Y}]B$$ for any fixed matrix $B$ of conformable dimensions, and $$\text{Cov}\left(A\mathbf{Y}+b\right) = A\text{Cov}\left(\mathbf{Y}\right)A^{\prime}\text{,}$$ where $\text{Cov}(\mathbf{Y}) = \mathbb{E}\left[\left(\mathbf{Y}-\mathbb{E}[\mathbf{Y}]\right)\left(\mathbf{Y}-\mathbb{E}[\mathbf{Y}]\right)^{\prime}\right]$ denotes the (variance-)covariance matrix of $\mathbf{Y}$.
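For reference, the third identity follows directly from this definition, since the fixed shift $b$ cancels when centering (this is the standard one-line argument, not necessarily the book's): $$\text{Cov}\left(A\mathbf{Y}+b\right) = \mathbb{E}\left[\left(A\mathbf{Y}-A\mathbb{E}[\mathbf{Y}]\right)\left(A\mathbf{Y}-A\mathbb{E}[\mathbf{Y}]\right)^{\prime}\right] = A\,\mathbb{E}\left[\left(\mathbf{Y}-\mathbb{E}[\mathbf{Y}]\right)\left(\mathbf{Y}-\mathbb{E}[\mathbf{Y}]\right)^{\prime}\right]A^{\prime} = A\,\text{Cov}\left(\mathbf{Y}\right)A^{\prime}\text{.}$$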
The book says that using $\text{Cov}\left(A\mathbf{Y}+b\right) = A\text{Cov}\left(\mathbf{Y}\right)A^{\prime}$, we can show that $\text{Cov}\left(\mathbf{Y}\right)$ is nonnegative definite for any random vector $\mathbf{Y}$.
So, taking $A = v^{\prime}$ (a $1 \times n$ matrix, so $r = 1$) and $b = 0$, we have for any $v \in \mathbb{R}^n$, $$v^{\prime} \text{Cov}\left(\mathbf{Y}\right) v = v^{\prime}\text{Cov}\left(\mathbf{Y}\right)\left(v^{\prime}\right)^{\prime} = \text{Cov}\left(v^{\prime}\mathbf{Y}\right)\text{.}$$ Just looking at the matrix dimensions, I know that $v^{\prime}\mathbf{Y}$ is $1 \times 1$, i.e., a scalar. In particular, $$v^{\prime}\mathbf{Y} = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}\begin{bmatrix} y_{1} \\ y_2 \\ \vdots \\ y_n\end{bmatrix} = \sum\limits_{i=1}^{n}y_iv_i\text{.}$$ Given only the machinery above, I'm not sure how to find $\text{Cov}\left(\sum\limits_{i=1}^{n}y_iv_i\right)$ without it getting very messy.
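As a quick numerical sanity check (purely illustrative, not from the book; all names and data below are made up), the sample covariance matrix satisfies the same bilinear identity, so $v^{\prime}\text{Cov}(\mathbf{Y})v$ and the sample variance of the scalar $v^{\prime}\mathbf{Y}$ agree to floating-point precision:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 10_000                             # dimension of Y, number of draws
L = rng.standard_normal((n, n))              # mixing matrix to induce correlation
samples = rng.standard_normal((m, n)) @ L.T  # m draws of Y, one per row
v = rng.standard_normal(n)                   # an arbitrary v in R^n

S = np.cov(samples, rowvar=False)            # sample covariance matrix of Y (n x n)
lhs = v @ S @ v                              # v' Cov(Y) v
rhs = np.var(samples @ v, ddof=1)            # sample variance of the scalar v'Y

print(lhs, rhs)                              # equal up to rounding, and nonnegative
```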
Note that I already asked a very similar question here, but I would like to know what the author of the book has in mind in suggesting the use of $\text{Cov}\left(A\mathbf{Y}+b\right) = A\text{Cov}\left(\mathbf{Y}\right)A^{\prime}$.