Although it is clear to me how the two concepts differ, it has been difficult for me to find a notation that makes it clear which type of entropy we are referring to.
From Wikipedia, we can see that the joint entropy of two random variables is defined in terms of the probabilities of the possible outcomes: https://en.wikipedia.org/wiki/Joint_entropy
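For reference, using the notation from that page, the joint entropy of two discrete random variables $X$ and $Y$ is

$$H(X,Y) = -\sum_{x}\sum_{y} P(x,y)\,\log P(x,y),$$

i.e. an expectation taken over the joint outcomes of the two random variables.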
However, the slightly more subtle definition of cross entropy, which is not symmetric, uses the concept of a distribution, yet the notation is the same: https://en.wikipedia.org/wiki/Cross_entropy
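There, for two discrete distributions $p$ and $q$ over the same support, the cross entropy is written with the same $H(\cdot,\cdot)$ symbol:

$$H(p,q) = -\sum_{x} p(x)\,\log q(x),$$

which is asymmetric in its arguments, since the expectation is taken under $p$ while the logarithm is of $q$.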
This "misleading" notation for cross entropy is also shown on the wikipedia page for Kullback-liebler definition: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence
To me, it looks like $H(X,Y)$ is fine for denoting joint entropy, but what should I then use for cross entropy?