You mention this post from Math StackExchange: https://math.stackexchange.com/questions/1073744/distinguishing-probability-measure-function-and-distribution
The answers there are excellent and self-sufficient; I recommend that anyone interested in the mathematical details go read them.
If you are asking the "same" question here, it is probably because you are not familiar with the terminology and the machinery of probability theory.
What I do not like about the other answers is that they focus on densities instead of what really matters: distributions and measures.
For this reason, here is some vocabulary, aimed at non-professional mathematicians. It is a quick-and-dirty presentation:
Functional analysis:
A distribution is a mathematical object developed in functional analysis; you do not need to know the details here. See https://en.wikipedia.org/wiki/Distribution_(mathematics)
A density is the concept that arises FROM a distribution in nice scenarios: when everything is sufficiently regular, the density is the derivative of the distribution (see the example just below).
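For a classical illustration (standard material, not specific to the linked thread): the Heaviside step function $H$ ($H(x) = 0$ for $x < 0$, $H(x) = 1$ for $x \ge 0$) is not differentiable at $0$ in the ordinary sense, yet its distributional derivative exists and is the Dirac delta $\delta$. For every test function $\varphi$,
$$\langle H', \varphi \rangle := -\int_{\mathbb{R}} H(x)\,\varphi'(x)\,dx = -\int_0^\infty \varphi'(x)\,dx = \varphi(0) = \langle \delta, \varphi \rangle,$$
so $H' = \delta$ in the sense of distributions.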
Probability Theory:
Cumulative Distribution Function (CDF): it is a "normalised" distribution, and for this reason its total value over the whole domain equals $1$ (over $\mathbb{R}$ if you wish, $\lim_{x \to \infty} F_X(x) = 1$).
Probability Density Function (PDF): the same relationship as above, but between the density and the CDF: when the CDF is differentiable, the PDF is its derivative (a worked example follows this list).
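As a concrete illustration (a standard example, not taken from the linked thread), take the exponential distribution with rate $\lambda > 0$:
$$F_X(x) = 1 - e^{-\lambda x} \quad (x \ge 0), \qquad f_X(x) = F_X'(x) = \lambda e^{-\lambda x},$$
and indeed $\lim_{x \to \infty} F_X(x) = 1$ and $\int_0^\infty f_X(x)\,dx = 1$.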
Measure Theory:
Random Variable: it is essentially a measurable function. If you do not understand this, skip it and come back to this topic later; for now, consider it as a function. All measurable functions are functions, but not all functions are measurable. However, in most situations, the functions you would think of are measurable.
Measure: a measure is a function from sets to the non-negative reals (possibly infinite). In other words, it attributes a weight to sets of elements. Measures have to respect some properties (such as countable additivity) that I do not detail here.
Probability Measure: a probability measure is a normalised measure, i.e. the measure of the whole space equals $1$ (see the example after this list).
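A minimal example (standard, not from the linked thread): the Lebesgue measure restricted to $[0,1]$ is a probability measure, since it assigns to each interval its length and gives the whole space measure $1$:
$$\mu([a,b]) = b - a \quad \text{for } 0 \le a \le b \le 1, \qquad \mu([0,1]) = 1.$$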
Now that we know the vocabulary, the important theorems you should be aware of are:
A random variable is ALWAYS associated to a CDF (cumulative distribution function). It is not possible to have one without the other: if I explicitly define $X$, you can find the CDF of $X$, denoted $F_X$, and conversely.
For every CDF, there exists a unique associated probability measure.
What does this mean? It means that if you start with a random variable, you have a CDF, which corresponds to a measure, and conversely! See the sketch just below.
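To make the correspondence tangible, here is a minimal Python sketch (my own illustration, using scipy; the variable names are mine, not from the linked thread). It takes a random variable $X \sim \mathrm{Exp}(1)$, uses its CDF to compute the measure of the interval $(a, b]$ as $F_X(b) - F_X(a)$, and checks the result against integrating the PDF:

```python
from scipy import stats
from scipy.integrate import quad

# A random variable X ~ Exp(1): scipy exposes its CDF and PDF directly.
X = stats.expon(scale=1.0)

a, b = 0.5, 2.0

# The probability measure of the interval (a, b], recovered from the CDF.
measure_from_cdf = X.cdf(b) - X.cdf(a)

# The same measure recovered from the PDF, since here the PDF is the
# derivative of the CDF.
measure_from_pdf, _ = quad(X.pdf, a, b)

print(measure_from_cdf)  # ~0.4712
print(measure_from_pdf)  # same value, up to numerical integration error
```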
As a bonus, when the PDF (probability density function) is equivalent to the CDF (in the sense that the derivative of the CDF equals the PDF and the anti-derivative of the PDF gives back the CDF), then what I said about the CDF (that it uniquely characterises the random variable) is also true for the PDF! Be careful in cases where the PDF and CDF are not equivalent; a counterexample follows.
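For a counterexample (standard material, my own choice of illustration): a Bernoulli random variable $X$ with $P(X = 1) = p$ has a perfectly good CDF but no density with respect to the Lebesgue measure, because the CDF jumps:
$$F_X(x) = \begin{cases} 0 & x < 0, \\ 1 - p & 0 \le x < 1, \\ 1 & x \ge 1, \end{cases}$$
and no function $f$ satisfies $F_X(x) = \int_{-\infty}^x f(t)\,dt$ for all $x$. Here the CDF (and the measure) still characterise $X$, but there is no PDF to work with.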
The conclusion is that yes, distributions and measures are equivalent: if you have one, you can construct the other. This is great because it simplifies many problems, since you can just work with whichever expression is easier to handle.