Not quite. The setting is a probability space $(\Omega,\mathfrak{F},\mathbb{P})$ and a measurable function $X$ whose domain is $\Omega$ and whose codomain is usually $\mathbb{R}$ with its Borel sigma-algebra $\mathfrak{B}$ (though in general it could be any measurable space).
$X$ induces a probability distribution $\mathbb{P}_X$ as the push-forward of $\mathbb{P}$ via $X$, sometimes written $X_{*}\mathbb{P},$ defined as
$$\mathbb{P}_X(E) = X_{*}\mathbb{P}(E) = \mathbb{P}(X^{-1}(E)) = \mathbb{P}\left(\{\omega\in\Omega\mid X(\omega)\in E\}\right)$$
for any event $E\in\mathfrak{B}.$
Let's do a simple example. Let $\Omega$ be the set of the three possible ways a flipped coin may land: heads, tails, or on its edge. Let its sigma-algebra $\mathfrak{F}$ consist of all subsets of $\Omega.$ Let the probability distribution $\mathbb{P}$ assign the value $p$ to $\{\text{Heads}\},$ $1-p$ to $\{\text{Tails}\},$ and $0$ to $\{\text{Side}\}.$ This determines $\mathbb P$ on every subset of $\Omega$ according to the laws of probability.
The function $X:\Omega\to\mathbb{R}$ that equals $1$ for $\omega=\text{Heads}$ and otherwise equals $0$ is the indicator of $\text{Heads}.$ $X$ is obviously measurable (because every subset of $\Omega$ is measurable). To figure out what $\mathbb{P}_X$ is, let $E\in\mathfrak{B}$ be a Borel-measurable set. $X_{*}\mathbb{P}(E)$ is the sum of up to three values: $p$ if $X(\text{Heads})\in E,$ plus $1-p$ if $X(\text{Tails})\in E,$ plus $0$ if $X(\text{Side})\in E$.
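This case analysis is easy to carry out numerically. Here is a minimal sketch (not part of the formal argument) that represents $\mathbb P$ on the finite $\Omega$ as a dictionary of point masses and computes $X_{*}\mathbb{P}(E)$ straight from the definition; the value $p=0.3$ is an arbitrary choice for illustration.

```python
# Sketch: the push-forward measure on a finite sample space.
# The value p = 0.3 is an arbitrary illustration, not from the text.
p = 0.3

P = {"Heads": p, "Tails": 1 - p, "Side": 0.0}   # the measure P on Omega
X = {"Heads": 1, "Tails": 0, "Side": 0}         # the indicator of Heads

def P_X(E):
    """P_X(E) = P({omega in Omega : X(omega) in E})."""
    return sum(prob for omega, prob in P.items() if X[omega] in E)

print(P_X({1}))     # P(X = 1) = p
print(P_X({0}))     # P(X = 0) = 1 - p
print(P_X({0, 1}))  # total mass: 1
```

Because $\Omega$ is finite, summing point masses over $X^{-1}(E)$ is exactly the displayed definition of $\mathbb{P}_X(E)$.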
One convenient way to express $\mathbb{P}_X$ uses the "one-point" (Dirac) measures $\delta_a$ defined on the Borel sets of $\mathbb{R}.$ These assign the value $1$ to an event $E$ when $a\in E$ and otherwise assign the value $0.$ It's easy to check that they are indeed measures.
The random variable $X$ thereby pushes $\mathbb P$ into the induced measure (or "induced probability function") $$\mathbb{P}_X = (1-p)\delta_0 + p\delta_1.$$
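As a quick sanity check (again a sketch, with $\delta_a$ implemented as a function of the event and an arbitrary illustrative $p$), this two-term combination reproduces the case analysis above:

```python
# Sketch: P_X as a weighted sum of one-point (Dirac) measures.
def delta(a):
    """delta_a(E) = 1 if a is in E, else 0."""
    return lambda E: 1.0 if a in E else 0.0

def P_X(E, p=0.3):  # p = 0.3 is an arbitrary illustrative value
    """P_X = (1 - p) * delta_0 + p * delta_1."""
    return (1 - p) * delta(0)(E) + p * delta(1)(E)

print(P_X({1}))     # p
print(P_X({0, 1}))  # 1
```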
Another description of the induced measure considers only events of the form $E(x)=(-\infty, x]$ for $x\in \mathbb{R},$ because these determine the entire Borel sigma-algebra of $\mathbb R.$ The formula
$$F_X: x\to \mathbb{P}_X(E(x)) = \mathbb{P}(X\le x) = \mathbb{P}\left(\{\omega\in\Omega\mid X(\omega)\le x\}\right)$$
defines a function on $\mathbb R,$ the cumulative distribution function of $X.$ It equals $0$ for $x\lt 0,$ jumps up to a constant value of $1-p$ for $0\le x \lt 1,$ and then jumps (by an amount $p$) up to $1$ for $x\ge 1.$
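The three-piece description of $F_X$ translates directly into code (a sketch, with an arbitrary illustrative value of $p$):

```python
def F_X(x, p=0.3):
    """CDF of a Bernoulli(p) variable: F_X(x) = P(X <= x)."""
    if x < 0:
        return 0.0      # no mass to the left of 0
    if x < 1:
        return 1 - p    # after the jump of size 1 - p at x = 0
    return 1.0          # after the further jump of size p at x = 1
```

Note that $F_X$ is right-continuous: the value at each jump point belongs to the piece on the right.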
In this example of a Bernoulli$(p)$ random variable, please notice that

- $X$ is neither an injection nor a surjection from $\Omega$ to $\mathbb R.$ Its image is merely the set $\{0,1\}.$
- $F_X$ is neither an injection nor a surjection from $\mathbb R$ to the set of possible probabilities $[0,1].$ Its image is merely the three-point set $\{0,\,1-p,\,1\}$ (assuming $0\lt p\lt 1$).