Can we solve this non-linear model as a constrained linear least squares problem?
In effect, yes, at least if you just need parameter estimates -- though arguably the constraint would have to be a strict inequality to match the nonlinear version exactly, since that parameterisation can't actually attain the boundary value. Note that it's often easier to find software for nonlinear least squares regression than for constrained regression. On the other hand, the constrained fit may converge where the NLS fit has difficulties.
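To make the equivalence concrete, here's a minimal sketch (in Python with scipy, since the idea is language-independent); the particular model form $y = \beta_0 + e^{\theta}x$ is an assumption for illustration:

```python
# Sketch (assumed model form): y = b0 + exp(theta) * x, i.e. a slope
# constrained to be positive. Compare nonlinear least squares on theta
# with linearly constrained least squares on the slope itself.
import numpy as np
from scipy.optimize import curve_fit, lsq_linear

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.7 * x + rng.normal(scale=0.5, size=x.size)  # LS slope clearly positive

# Nonlinear parameterisation: slope = exp(theta)
popt, _ = curve_fit(lambda x, b0, theta: b0 + np.exp(theta) * x, x, y, p0=[0.0, 0.0])
slope_nls = np.exp(popt[1])

# Constrained linear least squares: slope bounded below by 0
A = np.column_stack([np.ones_like(x), x])
res = lsq_linear(A, y, bounds=([-np.inf, 0.0], [np.inf, np.inf]))
slope_cls = res.x[1]

print(slope_nls, slope_cls)
```

When the unconstrained least squares slope is positive (as here), the two approaches land on essentially the same estimate; they only differ in behaviour when the constraint binds.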
Here's a pair of examples using least squares, your NLS formulation, and a constrained (non-negative) least squares fit (via `lm`, `nls` and `nnls::nnls` in R). Note, however, that you can actually impose a non-negativity constraint within `nls` itself, so you don't really need any new packages for that (see the discussion here and a code example with a box-constrained slope).

The first example has a least squares fit whose slope is already non-negative; all three methods gave the same fit. The second has a negative least squares slope. Fitting the nonlinear least squares model to it (use `trace=TRUE` to watch the iterations), the log-slope steps well into the negatives (making the slope effectively zero) before `nls` stops with a singular gradient error. The constrained version had no problem, though.
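Here's a minimal sketch of that second situation (again in Python with scipy, under the same assumed model form): when the unconstrained slope is negative, the non-negativity-constrained fit simply pins the slope at zero, whereas the $e^{\theta}$ parameterisation has to push $\theta \to -\infty$, which is where `nls` runs into its singular-gradient trouble.

```python
# Sketch (assumed model): data whose unconstrained LS slope is negative.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50)
y = 5.0 - 0.4 * x + rng.normal(scale=0.3, size=x.size)

A = np.column_stack([np.ones_like(x), x])

# Unconstrained least squares: the slope comes out negative
slope_ols = np.linalg.lstsq(A, y, rcond=None)[0][1]

# Non-negative least squares: every coefficient constrained >= 0
# (fine here because the fitted intercept is positive)
coef, _ = nnls(A, y)
print(slope_ols, coef[1])  # negative vs. pinned at zero
```

The constrained solver handles the active constraint directly, with no iteration off to infinity in a transformed parameter.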
[NB: If the fitted intercept would be negative then NNLS would not be suitable, since `nnls` constrains every coefficient -- the intercept included -- to be non-negative.]
Can we treat it also as a linear least squares problem?
You could estimate the product $\beta_1\beta_2$ with least squares just fine; you just can't estimate the two factors individually (because of the identification issue you mentioned).
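For instance, if the coefficients enter only through their product (say $y = \beta_0 + \beta_1\beta_2\, x$ -- an assumed form for illustration), ordinary least squares estimates that product directly as the slope, and any factorisation of it gives exactly the same fitted values:

```python
# Sketch (assumed model): y = b0 + (b1 * b2) * x. Only the product b1*b2
# is identified; OLS estimates it directly as the slope coefficient.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
b1, b2 = 2.0, 0.35            # any pair with product 0.7 fits identically
y = 1.0 + b1 * b2 * x + rng.normal(scale=0.2, size=x.size)

A = np.column_stack([np.ones_like(x), x])
coef = np.linalg.lstsq(A, y, rcond=None)[0]
product_hat = coef[1]         # estimates b1*b2; b1, b2 separately unidentifiable
print(product_hat)
```

The data cannot distinguish $(2.0, 0.35)$ from, say, $(0.7, 1.0)$ -- that is the identification issue in a nutshell.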