I have a problem that is similar to linear regression but differs in two main respects: 1) the number of regressors equals the number of observations, and 2) there are constraints on the regression coefficients.
Specifically, I have a system of linear equations
$$\mathbf{X} \boldsymbol{\beta} = \mathbf{y}$$
where $\mathbf{X}$ is a full-rank $N \times N$ matrix, and $\boldsymbol{\beta}$ and $\mathbf{y}$ are column vectors with $N$ components. I also have constraints on the components of $\boldsymbol{\beta}$:
$$\beta_i \ge 0.$$
After solving for $\boldsymbol{\beta}$ (which I do by constrained least squares), how do I test the significance of the model? I'd like something equivalent to the F-test in ordinary linear regression. Note that I cannot use the F-test directly: the number of coefficients (components of $\boldsymbol{\beta}$) equals the number of observations (the rank of $\mathbf{X}$), so the residual (second) degrees of freedom is zero and the denominator of F is undefined. I don't really care about the significance of the individual $\beta_i$'s.
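For concreteness, here is a minimal sketch of the kind of fit I mean, with simulated data purely for illustration; the solver (`scipy.optimize.nnls`) is just one way to do the non-negative least-squares fit and is not essential to the question:

```python
import numpy as np
from scipy.optimize import nnls

# Simulated data purely for illustration: N observations and N regressors,
# so X is a square matrix (full rank with probability 1 for Gaussian draws).
rng = np.random.default_rng(0)
N = 20
X = rng.normal(size=(N, N))
y = rng.normal(size=N)

# Constrained least squares with beta_i >= 0 (non-negative least squares).
beta_hat, resid_norm = nnls(X, y)

rss = resid_norm ** 2  # residual sum of squares of the constrained fit
df_resid = N - N       # zero residual degrees of freedom: this is where the classical F-test breaks down
print(beta_hat)
print(rss, df_resid)
```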
Any ideas? I have checked "regression with constraints", "How can I add minimum and maximum constraints to a coefficient in a regression in R?", and other questions suggested by Cross Validated, but they don't go into the question of p-values.