So I'm trying to show that regressing y on x in this case (fixed effects model):
$y_{it} = \beta x_{it} + \theta_{i} + \epsilon_{it}$
is the same as this regression:
$y_{it} - \bar{y}_i = \beta(x_{it} - \bar{x}_i) + \epsilon_{it}$
Average your first equation over $t$ within each group $i$ to get $\bar{y}_i = \beta \bar{x}_i + \theta_i + \bar{\epsilon}_i$. Moreover, notice that $\bar{\epsilon}_i = 0$ for the OLS residuals, because the model includes a separate intercept $\theta_i$ for each group, which absorbs the within-group mean of the error.
Then the only thing you have to do is subtract this averaged equation from your first equation, which gives exactly your second equation. I hope this helps.
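Written out term by term, the subtraction step is:

$$y_{it} - \bar{y}_i = \beta(x_{it} - \bar{x}_i) + (\theta_i - \theta_i) + (\epsilon_{it} - \bar{\epsilon}_i) = \beta(x_{it} - \bar{x}_i) + \epsilon_{it},$$

where the fixed effect cancels and the last step uses $\bar{\epsilon}_i = 0$.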
An alternative answer would use the Frisch-Waugh-Lovell theorem (the hint the question gives in parentheses).
First, regress $y$ and $x$ separately on the full set of group dummies $\theta_i$. The residuals from these two regressions are the demeaned variables $y_{it} - \bar{y}_i$ and $x_{it} - \bar{x}_i$ (a standard OLS result: a regression on group dummies alone fits the group means). Second, regress these residuals on each other, i.e. regress $y_{it} - \bar{y}_i$ on $x_{it} - \bar{x}_i$. The Frisch-Waugh-Lovell theorem states that the $\hat{\beta}$ from this residual regression is numerically identical to the $\hat{\beta}$ from the original regression.
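You can also verify the equivalence numerically. The sketch below (simulated data, with hypothetical group sizes and a true $\beta = 2$) computes $\hat{\beta}$ three ways with numpy: the dummy-variable (LSDV) regression, the within (demeaned) regression, and the FWL residual-on-residual regression; all three coincide to machine precision.

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, T = 5, 20
g = np.repeat(np.arange(n_groups), T)       # group index i for each obs
x = rng.normal(size=n_groups * T)
theta = rng.normal(size=n_groups)[g]        # fixed effect theta_i per group
y = 2.0 * x + theta + rng.normal(size=n_groups * T)

# 1) LSDV: regress y on x plus a full set of group dummies
D = (g[:, None] == np.arange(n_groups)).astype(float)
X = np.column_stack([x, D])
beta_lsdv = np.linalg.lstsq(X, y, rcond=None)[0][0]

# 2) Within estimator: demean y and x by group, then regress
def group_mean(v):
    return (np.bincount(g, weights=v) / np.bincount(g))[g]

y_dm = y - group_mean(y)
x_dm = x - group_mean(x)
beta_within = (x_dm @ y_dm) / (x_dm @ x_dm)

# 3) FWL: residuals from regressing y and x on the dummies are the
#    demeaned variables; regressing one on the other recovers beta
ry = y - D @ np.linalg.lstsq(D, y, rcond=None)[0]
rx = x - D @ np.linalg.lstsq(D, x, rcond=None)[0]
beta_fwl = (rx @ ry) / (rx @ rx)

print(beta_lsdv, beta_within, beta_fwl)     # all three agree
```

The three estimates agree exactly (up to floating-point error), which is precisely what the FWL theorem guarantees.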