I'm looking for assistance in understanding/implementing the following paper: *Covariance Matrix Estimation in Time Series*. Where I need help is Eq. 33.
Assume $EX_i = 0$. Using the idea of the lag window spectral density estimate, we estimate the covariance matrix $\Sigma_n = \operatorname{var}(S_n)$ of the sum $S_n = \sum_{i=1}^n X_i$ by
$$\hat\Sigma_n = \sum_{1 \le i, j \le n} K\left( \frac{i-j}{B_n} \right) X_i X_j^T$$
Here $K$ is a window function with $K(0)=1$ and $K(u)=0$ for $|u|>1$. What I don't understand is the requirement that $B_n$ is a lag sequence satisfying $B_n \to \infty$ and $B_n/n \to 0$: what does that mean for an implementation? My assumption is that the estimator is effectively taking a weighted sum of the sample variance and covariances, but it's not clear to me how to translate that into practice. I'm looking for an explanation that would allow an implementation for scalar $X_i$, or for references.
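To make the question concrete, here is my current reading. Grouping the double sum by the lag $k = i - j$, the scalar case appears to reduce to

$$\hat\Sigma_n = \sum_{i=1}^n X_i^2 + 2 \sum_{k=1}^{\lfloor B_n \rfloor} K\!\left(\frac{k}{B_n}\right) \sum_{i=k+1}^{n} X_i X_{i-k},$$

since $K$ kills every term with $|i-j| > B_n$, and for a fixed data set $B_n$ is just a number (the asymptotic conditions only say it should grow with $n$, but more slowly than $n$). Below is a minimal sketch of that reading in Python; the Bartlett window $K(u) = 1 - |u|$ for $|u| \le 1$ and the bandwidth choice $B_n = n^{1/3}$ are my own assumptions, not something the paper prescribes, and the function name is hypothetical. Is this the intended computation?

```python
import numpy as np

def lag_window_variance(x, b_n):
    """Lag-window estimate of Sigma_n = var(S_n) for a scalar,
    mean-zero series x, with bandwidth b_n.

    Assumes the Bartlett window K(u) = 1 - |u| for |u| <= 1; any K
    with K(0) = 1 and K(u) = 0 for |u| > 1 fits the stated setup.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Lag 0: K(0) = 1, so the diagonal contributes sum_i x_i^2.
    total = np.dot(x, x)
    # Lags 1..floor(B_n): each lag k occurs twice (i - j = k and
    # j - i = k), and K truncates everything with |i - j| > B_n.
    for k in range(1, min(int(np.floor(b_n)), n - 1) + 1):
        w = 1.0 - k / b_n                    # Bartlett weight K(k / B_n)
        gamma_k = np.dot(x[k:], x[:n - k])   # sum_{i=k+1}^n x_i x_{i-k}
        total += 2.0 * w * gamma_k
    return total

# Example: B_n = n^{1/3} satisfies B_n -> infinity and B_n / n -> 0.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print(lag_window_variance(x, b_n=len(x) ** (1 / 3)))
```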