For my reference and anyone else who might want it:
The best general answer comes from the post whuber linked, which shows that Gaussian marginals do not in general combine into a Gaussian joint distribution. So the answer to my posed question is no in general, not just when the variables are discontinuous, as demonstrated by svendvn.
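To see this concretely, here is a standard counterexample (not from the linked post, just an illustrative sketch): take $X \sim \mathcal{N}(0,1)$ and $Y = SX$ where $S = \pm 1$ with equal probability, independent of $X$. Both marginals are standard Gaussian, but $X + Y = 0$ exactly half the time, which is impossible for any (nondegenerate or degenerate-at-a-point) Gaussian joint:

```python
import random

random.seed(0)
n = 100_000
zeros = 0
for _ in range(n):
    x = random.gauss(0, 1)
    s = random.choice([-1, 1])  # Rademacher sign, independent of x
    y = s * x                   # y is also N(0,1) by symmetry
    # If (x, y) were jointly Gaussian, x + y would be Gaussian,
    # hence P(x + y == 0) would be 0 or 1, never 1/2.
    if x + y == 0:
        zeros += 1
print(zeros / n)  # close to 0.5
```

The atom of mass 1/2 at zero rules out joint Gaussianity even though each coordinate alone is perfectly Gaussian.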
One case where marginal Gaussians do form a Gaussian joint is when the variables are independent, or are independent under some linear transformation: as whuber mentioned, any multivariate Gaussian (MVG) is equivalent under a linear transformation to an MVG with zero covariances (*see below for explanation).
Another characterization is that the collection of random variables $X_1, \dots, X_n$ is jointly Gaussian iff for all constants $a_1, \dots, a_n$, the linear combination $\sum_i a_i X_i$ is Gaussian (see here).
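As a quick sanity check on the forward direction (a sketch with arbitrary coefficients I chose for illustration): a linear combination $a_1 X_1 + a_2 X_2$ of independent standard Gaussians should be Gaussian with variance $a_1^2 + a_2^2$, which we can check via the sample variance and the kurtosis (a Gaussian has kurtosis 3):

```python
import random

random.seed(2)
n = 200_000
a1, a2 = 3.0, -2.0  # arbitrary coefficients for the combination
samples = [a1 * random.gauss(0, 1) + a2 * random.gauss(0, 1) for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
kurt = sum((s - mean) ** 4 for s in samples) / n / var ** 2
print(var, kurt)  # var near a1^2 + a2^2 = 13, kurtosis near 3
```

This is only a moment check, not a proof, but it matches the closure property the characterization asserts; the counterexample above shows the check would fail for marginals that are not jointly Gaussian.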
*If we have a multivariate $X\sim \mathcal{N}(\mu, \Sigma)$, then a linear transformation $A$ gives $AX \sim \mathcal{N}(A\mu, A\Sigma A^\top)$. Since any covariance matrix $\Sigma$ is symmetric, it has the diagonalization $\Sigma = Q\Lambda Q^\top$, where $\Lambda$ is diagonal and $Q$ is orthogonal. Taking $A = Q^\top$ gives $Q^\top X \sim \mathcal{N}(Q^\top\mu, \Lambda)$: our $X$ can be transformed orthogonally to a coordinate system where the components are uncorrelated, and uncorrelated jointly Gaussian variables are independent, so in those coordinates $X$ is a product of independent Gaussian marginals.
So in summary, a GP must be defined via joint Gaussianity of its finite-dimensional distributions rather than merely Gaussian marginals, and an alternate definition along this second route would require the additional condition from the characterization above, i.e. "a stochastic process $X_t$ is a GP if for every finite subset $(X_{t_1}, \dots, X_{t_n})$, every linear combination $\sum_i a_i X_{t_i}$ is Gaussian."
Note: I'm still not sure how to prove, for a given set of variables, that they are independent under some linear transformation.