If you have the joint distribution of $(X,Y)$ at hand, then by definition
\begin{align}
P(X>Y)&=E[I_{X>Y}]
\\&=\begin{cases}\sum_{i>j}P(X=i,Y=j),&\small\text{ if } (X,Y)\text{ is discrete with a joint pmf}
\\\iint_{x>y}f_{X,Y}(x,y)\,dx\,dy,&\small\text{ if }(X,Y)\text{ is absolutely continuous with joint pdf }f_{X,Y}\end{cases}
\end{align}
These case formulas do not cover mixed random variables, or a pair $(X,Y)$ that has neither a joint pmf nor a joint pdf; in such cases one has to evaluate $P(X>Y)=E[I_{X>Y}]$ by other means.
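For concreteness, here is a minimal numerical sketch of the two case formulas; the toy joint pmf and the exponential pdf below are my own illustrative choices, not anything from the question.

```python
# Minimal sketch of P(X > Y) under the two case formulas (illustrative inputs only).
import numpy as np
from scipy import integrate

# Discrete case: P(X > Y) = sum_{i > j} P(X = i, Y = j).
# Hypothetical joint pmf: rows index values of X, columns index values of Y.
vals = np.array([0, 1, 2])
joint_pmf = np.array([[0.10, 0.05, 0.05],
                      [0.15, 0.10, 0.05],
                      [0.20, 0.20, 0.10]])   # entries sum to 1
p_discrete = sum(joint_pmf[i, j]
                 for i in range(len(vals))
                 for j in range(len(vals))
                 if vals[i] > vals[j])
print(p_discrete)                            # 0.55 for this pmf

# Continuous case: integrate the joint pdf over the region x > y.
# Example pdf: independent Exp(1) and Exp(2), f(x, y) = e^{-x} * 2 e^{-2y}.
f = lambda x, y: np.exp(-x) * 2.0 * np.exp(-2.0 * y)
# dblquad expects func(y, x) with the inner variable first; the region x > y
# is written as 0 < y < x with x running from 0 to infinity.
p_continuous, _ = integrate.dblquad(lambda y, x: f(x, y),
                                    0, np.inf,        # outer: x from 0 to inf
                                    0, lambda x: x)   # inner: y from 0 to x
print(p_continuous)                          # ~ 2/3; cf. the exponential example below
```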
Assuming independence and/or identical distributions of $X$ and $Y$ makes further simplification possible. Under independence, the joint pmf (pdf) factorizes into the product of the marginal pmfs (pdfs). In particular, if $X$ and $Y$ are independent and identically distributed continuous random variables, then $P(X>Y)=1/2$, as mentioned in the comments.
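For instance, taking independent $X\sim\text{Exp}(\lambda)$ and $Y\sim\text{Exp}(\mu)$ (purely as an illustration), the factorized pdf gives
\begin{align}
P(X>Y)&=\int_0^\infty\int_y^\infty \lambda e^{-\lambda x}\,\mu e^{-\mu y}\,dx\,dy
\\&=\int_0^\infty e^{-\lambda y}\,\mu e^{-\mu y}\,dy=\frac{\mu}{\lambda+\mu},
\end{align}
which equals $1/2$ when $\lambda=\mu$, consistent with the iid result.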
And of course, since $P(X>Y)=P(X-Y>0)$, if you can find the distribution of $X-Y$ from the joint distribution of $X$ and $Y$, then this probability can be computed directly from it.
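As a simple illustration of this route (again my own example), if $X\sim N(\mu_X,\sigma_X^2)$ and $Y\sim N(\mu_Y,\sigma_Y^2)$ are independent, then $X-Y\sim N(\mu_X-\mu_Y,\,\sigma_X^2+\sigma_Y^2)$, so
$$P(X>Y)=P(X-Y>0)=\Phi\!\left(\frac{\mu_X-\mu_Y}{\sqrt{\sigma_X^2+\sigma_Y^2}}\right),$$
where $\Phi$ denotes the standard normal cdf.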