As noted in the previous answer to your question, in the Dirichlet-multinomial model we assume a Dirichlet prior for the parameters $\pi_1, \pi_2, \dots, \pi_k$ of the multinomial distribution, which leads to the following model
$$\begin{align}
(x_1, x_2, \dots, x_k) &\sim \mathcal{M}(n, \, \pi_1, \pi_2, \dots, \pi_k) \\
(\pi_1, \pi_2, \dots, \pi_k) &\sim \mathcal{D}(\alpha_1, \alpha_2, \dots, \alpha_k)
\end{align}$$
We estimate the parameters by applying Bayes theorem, and because the Dirichlet distribution is a conjugate prior for the multinomial, we have a closed-form expression for the posterior distribution of $\pi_1, \pi_2, \dots, \pi_k$: it is Dirichlet with parameters $\alpha_1+x_1, \alpha_2+x_2, \dots, \alpha_k+x_k$. If you want point estimates of $\pi_1, \pi_2, \dots, \pi_k$, you can take the mean of this distribution
$$
E(\pi_i) = \frac{\alpha_i + x_i}{\sum_{j=1}^k (\alpha_j + x_j)}
$$
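To make the conjugate update concrete, here is a minimal sketch in Python. The counts and the symmetric prior (all $\alpha_j = 1$, i.e. uniform over the simplex) are hypothetical values chosen purely for illustration:

```python
import numpy as np

# Hypothetical observed counts for k = 3 categories
x = np.array([3, 7, 10])
# Symmetric Dirichlet prior with all alpha_j = 1 (uniform over the simplex)
alpha = np.ones_like(x)

# Conjugacy: the posterior is Dirichlet(alpha + x)
posterior = alpha + x

# Posterior mean of each pi_i
mean = posterior / posterior.sum()
print(mean)
```

The whole "estimation" step is a single vector addition; this is exactly what conjugacy buys you.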
but you could just as well look at other statistics of the distribution, such as the mode
$$
\mathrm{Mode}(\pi_i) = \frac{\alpha_i + x_i - 1}{\sum_{j=1}^k (\alpha_j + x_j -1)}
$$
Notice that the mode is defined only when all the posterior parameters satisfy $\alpha_i + x_i > 1$, since otherwise the mass of the distribution does not accumulate around a single peak, as you can see in the examples in this answer. The mode of the posterior distribution is also known as the maximum a posteriori (MAP) estimate.
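Continuing the hypothetical example above (counts `[3, 7, 10]` and a uniform Dirichlet prior), the mode formula can be sketched as:

```python
import numpy as np

# Hypothetical counts and uniform Dirichlet(1, ..., 1) prior, as before
x = np.array([3, 7, 10])
alpha = np.ones_like(x)
posterior = alpha + x  # Dirichlet posterior parameters

# MAP estimate (mode); valid here since every posterior parameter exceeds 1
k = len(posterior)
mode = (posterior - 1) / (posterior.sum() - k)
print(mode)
```

Note that with a uniform prior the mode reduces to the relative frequencies $x_i / n$, i.e. the maximum-likelihood estimate, which is a handy sanity check.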
As a side note: the above formulas are very simple, but in many cases (though not here) estimating the full posterior distribution is a hard problem. Finding a point estimate via MAP is often much easier, since the MAP can be obtained by optimization rather than by MCMC simulation of the full distribution. Consequently, people sometimes estimate the mode (a point estimate) directly, without recovering the full distribution.
Depending on your needs, you may also be interested in other statistics of the posterior distribution (median, quantiles, etc.).