No, least squares regression can't be written as a linear program, because the squared error is nonlinear w.r.t. the parameters. But some least squares regression problems can be written as quadratic programs.
Linear programming
Linear programs have the form:
$$\underset{\theta}{\min} \ c^T \theta
\quad \text{s.t.} \
\begin{array}[t]{l}
G \theta \le h \\
A \theta = b
\end{array}$$
where $\theta$ is the parameter vector, and the other vectors (in lower case) and matrices (in upper case) are given. Notice that the objective function and constraints$^1$ are both linear w.r.t. the parameters.
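To make the form above concrete, here's a small linear program solved with SciPy's `linprog`. The problem data (`c`, `G`, `h`) are made up for illustration; nonnegativity bounds play the role of extra inequality constraints:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data (not from the text): maximize theta_0 + 2*theta_1,
# i.e. minimize c^T theta with c = [-1, -2], subject to G theta <= h, theta >= 0.
c = np.array([-1.0, -2.0])
G = np.array([[1.0, 1.0],   # theta_0 + theta_1 <= 4
              [1.0, 0.0]])  # theta_0        <= 3
h = np.array([4.0, 3.0])

res = linprog(c, A_ub=G, b_ub=h, bounds=(0, None))  # bounds encode theta >= 0
theta_star = res.x  # optimal parameters: [0, 4], objective value -8
```

Since the second coefficient of $c$ is larger in magnitude, the optimum puts all the slack of the first constraint into $\theta_1$.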
Least squares
In contrast, the ordinary least squares (OLS) regression problem is:
$$\min_\theta \ \sum_{i=1}^n (y_i - x_i^T \theta)^2$$
$$= \min_\theta \ (y - X \theta)^T (y - X \theta)$$
where $\theta$ is the parameter vector, $y = [y_1, \dots, y_n]^T$ is the response vector, and $X$ is the design matrix, whose rows are the points $x_1, \dots, x_n$.
Notice that this is an unconstrained problem with a quadratic objective function. Linear programming won't work here because it requires a linear objective function, as above. The same argument applies to other forms of regression using the squared error---the objective function is generally nonlinear w.r.t. the parameters.
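To see the unconstrained quadratic problem in code, here's a sketch solving OLS directly with NumPy. The data are made up and noiseless, so the fit recovers the true parameters exactly:

```python
import numpy as np

# Made-up design matrix X (rows are the points x_i) and true parameters.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
theta_true = np.array([0.5, 2.0])
y = X @ theta_true  # noiseless responses, so OLS recovers theta_true

# Minimize (y - X theta)^T (y - X theta); lstsq solves this least squares problem.
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```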
Quadratic programming
Rather, the OLS problem is a special case of a quadratic program. Quadratic programs have a quadratic objective function and linear constraints:
$$\min_\theta \ \theta^T P \theta + q^T \theta
\quad \text{s.t. } \
\begin{array}[t]{l}
G \theta \le h \\
A \theta = b
\end{array}$$
OLS corresponds to setting $P = X^T X$ and $q = -2 X^T y$, with no constraints$^2$. Certain regularized linear regression problems (e.g. lasso and ridge regression) can also be expressed as quadratic programs. But not all least squares regression problems can be written this way, because they might involve non-quadratic objective functions$^3$ or nonlinear constraints.
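A quick sanity check of this correspondence (a sketch with made-up data): build $P = X^T X$ and $q = -2 X^T y$, minimize the unconstrained quadratic in closed form (setting the gradient $2P\theta + q$ to zero), and compare against the direct least squares fit:

```python
import numpy as np

# Made-up data for illustration.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.1, 1.9, 3.2, 3.8])

# QP data for the OLS objective theta^T P theta + q^T theta.
P = X.T @ X
q = -2.0 * X.T @ y

# Unconstrained minimizer: 2 P theta + q = 0  =>  P theta = -q / 2,
# which is exactly the normal equations X^T X theta = X^T y.
theta_qp = np.linalg.solve(P, -q / 2.0)

# Direct OLS solution for comparison.
theta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The two solutions agree, confirming that the QP with this $P$ and $q$ is the OLS problem.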
$^1$ Bound constraints ($l \le \theta \le u$) can also be written in the form of linear inequality constraints.
$^2$ To see this for yourself, expand out the OLS objective function and drop constant terms.
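For reference, the expansion is:
$$(y - X\theta)^T (y - X\theta) = \theta^T X^T X \theta - 2 y^T X \theta + y^T y,$$
so with $P = X^T X$ and $q = -2 X^T y$ the objective equals $\theta^T P \theta + q^T \theta$ plus the constant $y^T y$, which doesn't affect the minimizer.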
$^3$ E.g. consider fitting an exponential curve $\hat{f}(x) = \theta_0 \exp(\theta_1 \cdot x)$ using least squares.
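Such a nonlinear least squares fit can be handled with an iterative solver instead. A sketch using SciPy's `curve_fit` on the exponential model from footnote 3 (the data and true parameters are made up; noiseless, so the solver recovers them):

```python
import numpy as np
from scipy.optimize import curve_fit

# The exponential model from footnote 3: f(x) = theta_0 * exp(theta_1 * x).
def f(x, theta0, theta1):
    return theta0 * np.exp(theta1 * x)

# Made-up, noiseless data generated from known parameters.
x = np.linspace(0.0, 2.0, 20)
y = f(x, 2.0, 0.5)

# curve_fit minimizes the squared error iteratively; the objective is
# non-quadratic in (theta0, theta1), so this is not a quadratic program.
theta_fit, _ = curve_fit(f, x, y, p0=[1.0, 1.0])
```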