The answer from @zxyue is fantastic.
However, it omits the derivation of the first equation, $\mathbb{E}[\ln\mu_j]=\int_0^1 \ln \mu_j \text{Dir}(\boldsymbol{\mu}|\boldsymbol{\alpha}) d\mu_j$. This answer supplements @zxyue's answer by filling in that step.
By the definition of expectation, $\mathbb{E}[X]=\int_\mathbb{R}xf(x)dx$. Then
$$
\begin{align}
\mathbb{E}[\ln\mu_j]
&=\int\ln\mu_jf(\boldsymbol{\mu})d\boldsymbol{\mu}\\
&=\int\dots\int\ln\mu_jf(\boldsymbol{\mu})d\mu_1\dots d\mu_K\\
&=\int\dots\int\int\int\dots\int\ln\mu_jf(\boldsymbol{\mu})d\mu_1\dots d\mu_{j-1}d\mu_jd\mu_{j+1}\dots d\mu_K
\end{align}
$$
Move the integral $\int\dots d\mu_j$ to the outermost position, with $[0,1]$ as its interval:
$$
\mathbb{E}[\ln\mu_j]
=\int_0^1\int_0^{1-\sum_{k=1}^{K-1}{\mu_k}}\dots\int_0^{1-\sum_{k=1}^{j}{\mu_k}}\int_0^{1-\mu_j-\sum_{k=1}^{j-2}\mu_k}\dots\int_0^{1-\mu_j}\ln\mu_jf(\boldsymbol{\mu})d\mu_1\dots d\mu_{j-1}d\mu_{j+1}\dots d\mu_Kd\mu_j
$$
Since $\ln\mu_j$ does not depend on $\mu_1,\dots,\mu_{j-1},\mu_{j+1},\dots,\mu_K$, it can be treated as a constant with respect to the inner integrals and moved out to the outermost integral:
$$
\begin{align}
\mathbb{E}[\ln\mu_j]
&=\int_0^1\ln\mu_j\int_0^{1-\sum_{k=1}^{K-1}{\mu_k}}\dots\int_0^{1-\sum_{k=1}^{j}{\mu_k}}\int_0^{1-\mu_j-\sum_{k=1}^{j-2}\mu_k}\dots\int_0^{1-\mu_j}f(\boldsymbol{\mu})d\mu_1\dots d\mu_{j-1}d\mu_{j+1}\dots d\mu_Kd\mu_j\\
&=\int_0^1\ln\mu_jg(\mu_j)d\mu_j
\end{align}
$$
where $g(\mu_j)$ denotes the inner integral $\int_0^{1-\sum_{k=1}^{K-1}{\mu_k}}\dots\int_0^{1-\sum_{k=1}^{j}{\mu_k}}\int_0^{1-\mu_j-\sum_{k=1}^{j-2}\mu_k}\dots\int_0^{1-\mu_j}f(\boldsymbol{\mu})d\mu_1\dots d\mu_{j-1}d\mu_{j+1}\dots d\mu_K$.
By inspection, $g(\mu_j)$ is exactly the marginal density of the random variable $M_j$. That is to say
$$
\begin{align}
g(\mu_j)&=\int_0^{1-\sum_{k=1}^{K-1}{\mu_k}}\dots\int_0^{1-\sum_{k=1}^{j}{\mu_k}}\int_0^{1-\mu_j-\sum_{k=1}^{j-2}\mu_k}\dots\int_0^{1-\mu_j}f(\boldsymbol{\mu})d\mu_1\dots d\mu_{j-1}d\mu_{j+1}\dots d\mu_K\\
&=f_{M_j}(\mu_j)
\end{align}
$$
As we know, $\boldsymbol{\mu}$ follows the Dirichlet distribution $\text{Dir}(\boldsymbol{\mu}|\boldsymbol{\alpha})$, which is the multivariate generalization of the Beta distribution. A standard property of the Dirichlet distribution is that each of its marginals is a Beta distribution: $f_{M_j}(\mu_j)$ is the probability density function of $\text{Beta}(\alpha_j, \alpha_0 - \alpha_j)$, where $\alpha_0=\sum_{k=1}^{K}\alpha_k$. Then we can write $\mathbb{E}[\ln\mu_j]$ as follows
$$
\begin{align}
\mathbb{E}[\ln\mu_j]
&=\int\ln\mu_jf(\boldsymbol{\mu})d\boldsymbol{\mu}\\
&= \int_0^1\ln\mu_jg(\mu_j)d\mu_j\\
&= \int_0^1\ln\mu_jf_{M_j}(\mu_j)d\mu_j\\
&= \int_0^1 \ln \mu_j \text{Beta}(\mu_j|\alpha_j, \alpha_0 - \alpha_j) d\mu_j
\end{align}
$$
This is consistent with the second equation in @zxyue's answer.
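Evaluating that final Beta integral gives the well-known closed form $\mathbb{E}[\ln\mu_j]=\psi(\alpha_j)-\psi(\alpha_0)$, where $\psi$ is the digamma function. If you want to convince yourself numerically, here is a minimal sketch using only the Python standard library (the concentration parameters, the Gamma-normalization sampler, and the digamma approximation are my own illustrative choices, not from the original answer):

```python
import math
import random

def digamma(x):
    """Approximate psi(x): recurrence to push x >= 6, then asymptotic expansion."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x          # psi(x) = psi(x + 1) - 1/x
        x += 1.0
    f = 1.0 / (x * x)
    # psi(x) ~ ln x - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

rng = random.Random(42)
alpha = [2.0, 3.0, 5.0]       # example concentration parameters
alpha0 = sum(alpha)
j = 0                         # check E[ln mu_1]

# Monte Carlo: a Dirichlet draw is independent Gamma(alpha_k, 1) draws, normalized
n = 200_000
total = 0.0
for _ in range(n):
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total += math.log(g[j] / sum(g))   # ln(mu_j) for one Dirichlet sample
mc_estimate = total / n

closed_form = digamma(alpha[j]) - digamma(alpha0)   # psi(2) - psi(10) ~ -1.829
print(mc_estimate, closed_form)
```

The Monte Carlo average agrees with $\psi(\alpha_j)-\psi(\alpha_0)$ to within sampling noise.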
For a formal derivation of the marginal distribution of the Dirichlet distribution, please refer to the answer to the question Find marginal distribution of $K$-variate Dirichlet.
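As an informal sanity check on that marginal property, one can sample from a Dirichlet and compare the empirical mean and variance of $\mu_j$ with the moments of $\text{Beta}(\alpha_j, \alpha_0-\alpha_j)$. The sketch below uses only the Python standard library; the specific $\boldsymbol{\alpha}$ and sample size are arbitrary illustrative choices:

```python
import random

def sample_dirichlet(alpha, rng):
    """Sample from Dir(alpha) by normalizing independent Gamma(alpha_k, 1) draws."""
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

rng = random.Random(0)
alpha = [2.0, 3.0, 5.0]
alpha0 = sum(alpha)
j = 0                             # check the marginal of mu_1

n = 200_000
draws = [sample_dirichlet(alpha, rng)[j] for _ in range(n)]
emp_mean = sum(draws) / n
emp_var = sum((x - emp_mean) ** 2 for x in draws) / n

# Moments of Beta(a, b) with a = alpha_j and b = alpha_0 - alpha_j
a, b = alpha[j], alpha0 - alpha[j]
beta_mean = a / (a + b)                           # 2/10 = 0.2
beta_var = a * b / ((a + b) ** 2 * (a + b + 1))   # 16/1100 ~ 0.0145

print(emp_mean, beta_mean)
print(emp_var, beta_var)
```

Both the empirical mean and variance of $\mu_j$ match the Beta moments, as the marginal property predicts.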