Given is a single datapoint $\vec{x}$ that contains the same feature twice, i.e. $\vec{x} = (x,x)$, together with a label $y$. We have to add the regularization term $\lambda\|\vec{\beta}\|^2$, $\lambda > 0$, to the ordinary linear regression problem, solve it, and explain what relationship the components of the solution $\vec{\beta}^*$ satisfy. I would use the following equation to solve the problem by hand: $(X^TX + \lambda I)\,\vec{\beta} = X^Ty$, specifically for our problem:
$\begin{bmatrix} x^2 & x^2\\ x^2 & x^2\end{bmatrix}\begin{bmatrix}\beta_1 \\ \beta_2 \end{bmatrix}+\begin{bmatrix}\lambda\beta_1 \\ \lambda\beta_2 \end{bmatrix} = \begin{bmatrix}xy \\ xy \end{bmatrix}$
Is this correct so far? I know that for ridge regression the closed-form solution always exists (the added regularization makes $X^TX + \lambda I$ invertible), but I am not sure what I can conclude about the solution and its components in this particular case.
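As a sanity check on the hand calculation, here is a minimal numpy sketch that solves the 2×2 system above numerically; the values of `x`, `y`, and `lam` are placeholders I chose for illustration, not part of the problem statement:

```python
import numpy as np

# Placeholder values for the single datapoint and the regularization strength.
x, y, lam = 2.0, 3.0, 0.5

# Design matrix with the duplicated feature: one row (x, x).
X = np.array([[x, x]])
Y = np.array([y])

# Ridge normal equations: (X^T X + lambda * I) beta = X^T Y.
A = X.T @ X + lam * np.eye(2)
b = X.T @ Y

beta = np.linalg.solve(A, b)
print(beta)  # the two components of the candidate solution beta*
```

Comparing this numerical output against the solution of the system I wrote above should tell me whether my setup of the normal equations is right.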