For convenience, let $X$ denote a continuous zero-mean random variable with
density function $f(x)$, and consider $P\{X \geq a\}$ where $a > 0$. We have
$$P\{X \geq a\} = \int_a^{\infty}f(x)\,\mathrm dx
= \int_{-\infty}^{\infty}g(x)f(x)\,\mathrm dx = E[g(X)]$$
where $g(x) = \mathbf 1_{[a,\infty)}(x)$ is the indicator of the event $\{X \geq a\}$. If $n$ is an even positive integer and $b$ any positive real number, then
$$h(x) = \left(\frac{x+b}{a+b}\right)^n \geq g(x), \qquad -\infty < x < \infty,$$
since $h(x) \geq 1$ for $x \geq a$ while $h(x) \geq 0$ everywhere ($n$ being even), and so
$$E[h(X)] = \int_{-\infty}^{\infty} h(x)f(x)\,\mathrm dx
\geq \int_{-\infty}^{\infty}g(x)f(x)\,\mathrm dx = E[g(X)].$$
Thus, for all positive real numbers $a$ and $b$,
$$P\{X \geq a\} \leq E\left[\left(\frac{X+b}{a+b}\right)^n\right]
= (a+b)^{-n}E[(X+b)^n]\tag{1}$$
where the rightmost expectation in $(1)$ is the $n$-th moment
($n$ even) of $X$ about $-b$. When $n = 2$, since $X$ has mean zero we have
$E[(X+b)^2] = \sigma^2 + b^2$, where $\sigma^2 = E[X^2]$ is the variance of $X$;
the resulting bound is smallest when $b = \sigma^2/a$, giving the
one-sided Chebyshev inequality (also called the Chebyshev-Cantelli inequality):
$$P\{X \geq a\} \leq \frac{\sigma^2}{a^2 + \sigma^2}.$$
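The optimal choice $b = \sigma^2/a$ can be verified by a short calculus step. Since $E[X] = 0$, we have $E[(X+b)^2] = \sigma^2 + b^2$, and differentiating the bound in $(1)$ with respect to $b$ gives
$$\frac{d}{db}\,\frac{\sigma^2 + b^2}{(a+b)^2}
= \frac{2b(a+b)^2 - 2(a+b)(\sigma^2 + b^2)}{(a+b)^4}
= \frac{2(ab - \sigma^2)}{(a+b)^3},$$
which vanishes at $b = \sigma^2/a$ and changes sign from negative to positive there, so this is a minimum. Substituting back,
$$\frac{\sigma^2 + \sigma^4/a^2}{(a + \sigma^2/a)^2}
= \frac{\sigma^2(a^2 + \sigma^2)/a^2}{(a^2 + \sigma^2)^2/a^2}
= \frac{\sigma^2}{a^2 + \sigma^2}.$$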
For larger even values of $n$, the minimization with respect to $b$ is messier.
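As a quick numerical sanity check (a sketch, not part of the derivation), one can compare the empirical tail probability of a simulated zero-mean sample against the Cantelli bound; the choice $X \sim N(0,1)$ and the variable names below are illustrative.

```python
import random

# Monte Carlo check of Cantelli's bound P{X >= a} <= sigma^2 / (a^2 + sigma^2)
# for a zero-mean X. Here X ~ Normal(0, 1), so sigma^2 = 1 (illustrative choice).
random.seed(0)
sigma2 = 1.0
a = 1.0
n = 200_000

samples = [random.gauss(0.0, 1.0) for _ in range(n)]
tail = sum(x >= a for x in samples) / n          # empirical P{X >= a}
bound = sigma2 / (a**2 + sigma2)                 # Cantelli bound: 0.5 here

print(f"empirical P(X >= {a}) = {tail:.4f}, Cantelli bound = {bound:.4f}")
assert tail <= bound
```

For a standard normal, the true tail probability is about $0.159$, well below the bound of $0.5$; the bound is loose here because it must hold for every zero-mean distribution with the same variance.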