For a linear regression $Y = X\beta + \varepsilon$ with $\varepsilon \sim \mathcal N(0,\sigma^2 I)$, we have $\hat Y = H Y$ for the hat matrix $H = X(X^TX)^{-1}X^T$. Since $I - H$ is symmetric and idempotent, $Var(Y - \hat Y) = (I-H)\,\sigma^2 I\,(I-H)^T = \sigma^2(I-H)$, so in particular $Var(Y_i - \hat Y_i) = \sigma^2(1-h_i)$, where $h_i = H_{ii}$ is the $i$-th leverage.
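As a sanity check, here is a minimal numpy sketch on hypothetical toy data ($n$, $p$, $\sigma$, $\beta$, and the seed are all made up) that confirms $I-H$ is symmetric and idempotent, and compares $\sigma^2(1-h_i)$ against a Monte Carlo estimate of the residual variance:

```python
import numpy as np

rng = np.random.default_rng(0)   # toy setup: n, p, sigma, beta all made up
n, p, sigma = 50, 3, 2.0
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)

# Hat matrix H = X (X^T X)^{-1} X^T and leverages h_i = H_ii
H = X @ np.linalg.solve(X.T @ X, X.T)
h = np.diag(H)

# I - H is symmetric and idempotent, which is what gives Var(Y - Yhat) = sigma^2 (I - H)
M = np.eye(n) - H
print(np.allclose(M, M.T), np.allclose(M @ M, M))   # True True

# Monte Carlo check of Var(Y_i - Yhat_i) = sigma^2 (1 - h_i)
reps = 50_000
Y = X @ beta + sigma * rng.normal(size=(reps, n))   # each row is one draw of Y
resid = Y @ M                                       # residuals (I - H) Y, row-wise (M is symmetric)
print(np.allclose(resid.var(axis=0), sigma**2 * (1 - h), rtol=0.05))   # True
```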
Suppose the rows of my predictor matrix $X$ are $x_1^T,\dots,x_n^T$ with $x_i\in\mathbb R^p$. I want to bound this residual variance, and therefore $1-h_i$, in terms of $\|x_i - \bar x\|^2$, the squared distance from $x_i$ to the mean $\bar x = \frac 1n X^T\mathbf 1 = \frac 1n\sum_j x_j$. Is there something nice I can do here?
In a simple linear regression (an intercept plus a single predictor) I know that $$ 1 - h_i = 1 - \frac 1n - \frac{(x_i - \bar x)^2}{\sum_j (x_j - \bar x)^2}, $$ so the $(x_i - \bar x)^2$ term hands me exactly such a bound. But what about in a multiple regression?
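For what it's worth, that closed form checks out numerically against the diagonal of $H$ (again a hypothetical toy sketch, with the design an intercept column plus one predictor):

```python
import numpy as np

rng = np.random.default_rng(1)        # toy data, all values made up
n = 30
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])  # intercept plus a single predictor

# Leverages from the hat matrix...
H = X @ np.linalg.solve(X.T @ X, X.T)
h = np.diag(H)

# ...versus the closed form h_i = 1/n + (x_i - xbar)^2 / sum_j (x_j - xbar)^2
d = x - x.mean()
print(np.allclose(h, 1/n + d**2 / (d**2).sum()))   # True
```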