With probability one, the realizations of a Dirichlet Process are discrete probability measures. A rigorous proof can be found in
Blackwell, D. (1973). "Discreteness of Ferguson Selections", The Annals of Statistics, 1(2): 356–358.
The stick-breaking representation of the Dirichlet Process makes this property transparent.
Draw independent $B_i\sim\mathrm{Beta}(1,c)$, for $i\geq 1$.
Define $P_1=B_1$ and $P_i=B_i \prod_{j=1}^{i-1}(1-B_j)$, for $i>1$.
Draw independent $Y_i\sim F$, for $i\geq 1$.
Sethuraman (1994) proved that the discrete distribution function
$$
G(t,\omega)=\sum_{i=1}^\infty P_i(\omega) I_{[Y_i(\omega),\infty)}(t)
$$
is a realization of a Dirichlet Process with concentration parameter $c$, centered at the distribution function $F$. The construction appears in
Sethuraman, J. (1994). "A Constructive Definition of Dirichlet Priors", Statistica Sinica, 4: 639–650.
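The construction is easy to simulate by truncating the stick-breaking sum at a finite number of atoms. A minimal sketch, assuming (arbitrarily, for illustration) a standard normal base measure $F$, concentration $c=2$, and a truncation level of 1000 atoms:

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(c, base_sampler, n_atoms=1000, rng=rng):
    """Truncated stick-breaking draw from a Dirichlet Process.

    Returns atom locations Y_i ~ F and weights
    P_i = B_i * prod_{j<i} (1 - B_j), with B_i ~ Beta(1, c),
    renormalized to account for the truncation.
    """
    betas = rng.beta(1.0, c, size=n_atoms)                      # B_i ~ Beta(1, c)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining                                  # P_i
    weights /= weights.sum()                                     # renormalize after truncation
    atoms = base_sampler(n_atoms, rng)                           # Y_i ~ F
    return atoms, weights

# Illustrative choices: F = N(0, 1), c = 2
atoms, weights = stick_breaking(c=2.0,
                                base_sampler=lambda n, r: r.standard_normal(n))

# A handful of atoms carry most of the mass: the realization is discrete
print(np.sort(weights)[::-1][:5])
```

The realization $G$ is the distribution that puts mass `weights[i]` at `atoms[i]`; even though $F$ is continuous, every draw is a discrete measure.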
The expectation of this Dirichlet Process is simply $F$, which may be the distribution function of a continuous random variable. But if random variables $X_1,\dots,X_n$ form a random sample from a distribution drawn from this Dirichlet Process, the posterior expectation is a probability measure that puts positive mass on each sample point.
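To make the last claim concrete: by the conjugacy of the Dirichlet Process (Ferguson's original result), the posterior given $X_1,\dots,X_n$ is again a Dirichlet Process, with posterior expectation
$$
E\left[G \mid X_1,\dots,X_n\right](t) = \frac{c}{c+n}\,F(t) + \frac{n}{c+n}\,F_n(t),
$$
where $F_n$ is the empirical distribution function of the sample. The second term places mass $1/(c+n)$ at each observed $X_i$, which is exactly the positive mass on the sample points mentioned above.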
Returning to the original question: this is why the plain Dirichlet Process may be unsuitable as a prior for some problems in Bayesian nonparametrics, such as Bayesian density estimation, but suitable extensions, like Dirichlet Process mixture models, are available to handle these cases.