Questions tagged [inequality]

Use this tag if your question involves the use of an inequality. The inequality may have probabilistic origins or be a purely mathematical one. Do not use it for measures of inequality, such as income inequality; for those, use the [diversity] tag.

Examples include questions requiring the use of the Markov, Chebyshev, Chernoff, Efron-Stein, Azuma, Kolmogorov, Hölder, or Jensen inequalities, among others.

118 questions
17 votes, 1 answer

Oracle Inequality: In basic terms

I'm going through a paper that uses an oracle inequality to prove something, but I'm unable to understand what it is even trying to do. When I searched online for 'Oracle Inequality', some sources directed me to the article "Candes, Emmanuel J.…
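For orientation, oracle inequalities generically take the following shape (a template, not a claim about the specific paper): the estimator's risk is bounded by the risk of the best choice an all-knowing "oracle" could make from some class, up to a constant and a remainder term,

$$\mathcal{R}(\hat{f}) \leq C \cdot \inf_{f \in \mathcal{F}} \mathcal{R}(f) + \text{remainder}.$$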
9 votes, 2 answers

Proof that variance is always greater than or equal to zero

It is common knowledge that $$\operatorname{Var}(X) \geq 0$$ for every random variable $X$. Despite this, I do not remember ever seeing a formal proof. Is there a proof of the above inequality? What if we include the…
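For context, the standard one-line argument (presumably along the lines of the answers) is monotonicity of expectation applied to a nonnegative random variable:

$$\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}X)^2\big] \geq 0,$$

since $(X - \mathbb{E}X)^2 \geq 0$ pointwise and the expectation of a nonnegative random variable is nonnegative (whenever the variance exists at all).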
8 votes, 2 answers

Bounds on Cov(X, Y) given Var(X), Var(Y)?

I'm generating random multivariate normal data using the rmultnorm() function in R, which allows users to specify a vector of $k$ population means and a $k \times k$ variance-covariance matrix. Given $\mathrm{Var}(X)$, and…
RobertF
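The binding constraint here is the Cauchy-Schwarz bound $|\operatorname{Cov}(X,Y)| \leq \sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}$, which is exactly the condition for the $2 \times 2$ covariance matrix to be positive semidefinite. A minimal numerical illustration (a sketch in Python/numpy rather than R; the specific variances are made up):

    import numpy as np

    var_x, var_y = 4.0, 9.0
    bound = np.sqrt(var_x * var_y)  # Cauchy-Schwarz: |Cov(X, Y)| <= sqrt(Var(X) * Var(Y))

    for cov in (-bound, 0.0, bound, bound + 0.1):
        sigma = np.array([[var_x, cov], [cov, var_y]])
        # A valid covariance matrix must be positive semidefinite:
        psd = np.all(np.linalg.eigvalsh(sigma) >= -1e-12)
        print(f"cov = {cov:+.2f}: valid covariance matrix: {psd}")

Only the last candidate, which exceeds the bound, fails the check.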
7 votes, 1 answer

Prove that Kurtosis is at least one more than the square of the skewness

Wikipedia claims it, and on reading the paper it links to, I found the proof written there quite difficult. Is a simpler proof possible for this inequality? The proof given in the linked paper is as follows:
Martund
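A short route does exist (a sketch; no claim that it matches the linked paper): let $Z = (X-\mu)/\sigma$, so that $\mathbb{E}Z = 0$ and $\mathbb{E}Z^2 = 1$. Applying Cauchy-Schwarz to $\operatorname{Cov}(Z, Z^2)$,

$$\big(\mathbb{E}Z^3\big)^2 = \operatorname{Cov}(Z, Z^2)^2 \leq \operatorname{Var}(Z)\operatorname{Var}(Z^2) = \mathbb{E}Z^4 - 1,$$

so the kurtosis $\mathbb{E}Z^4$ is at least $1 + (\mathbb{E}Z^3)^2$, one more than the squared skewness.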
7 votes, 0 answers

Inequalities on Fisher Information / expected second derivative?

Under some regularity conditions we can compute the Fisher information as $-\mathbb{E}_{\theta_0}\left[\frac{\partial^2}{\partial \theta^2} \ln f(x;\theta_0)\right]$. I was wondering if there are some kind of inequality results involving the above expectation…
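For reference, the best-known inequality built from this quantity is the Cramér-Rao bound (whether this is the kind of result the asker wants is unclear): under the usual regularity conditions, any unbiased estimator $\hat\theta$ satisfies

$$\operatorname{Var}_{\theta_0}(\hat\theta) \geq \frac{1}{I(\theta_0)}, \qquad I(\theta_0) = -\mathbb{E}_{\theta_0}\!\left[\frac{\partial^2}{\partial \theta^2} \ln f(X;\theta_0)\right].$$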
7 votes, 1 answer

Lower Bound on $E[\frac{1}{X}]$ for positive symmetric distribution

Let $X$ be a positive random variable whose distribution is symmetric about its mean $m$. Then $$ E\left[\frac{1}{X}\right] \geq \frac{1}{m} + \frac{\sigma^2}{m^3}, $$ where $\sigma^2$ is the variance of $X$. I can just prove that $$…
Ethan
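One route to the full bound (a sketch under the stated symmetry assumption; not necessarily the thread's answer) starts from the algebraic identity, valid for all $x > 0$,

$$\frac{1}{x} = \frac{1}{m} - \frac{x-m}{m^2} + \frac{(x-m)^2}{m^3} + \frac{(m-x)^3}{m^3 x}.$$

Taking expectations kills the linear term and turns the quadratic term into $\sigma^2/m^3$; the last term has nonnegative expectation because symmetry pairs $x = m-d$ with $x = m+d$, whose contributions sum to $\frac{d^3}{m^3}\left(\frac{1}{m-d} - \frac{1}{m+d}\right) \geq 0$ (positivity of $X$ forces $d < m$).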
6 votes, 2 answers

What is the correlation between a random variable and its probability integral transform?

Are there known bounds on $\operatorname{cor}(X,F(X))$, where $X$ is a random variable with CDF $F$? Let $X$ have a fixed variance, for example $\operatorname{var}(X)=1$. Which $X$ maximizes or minimizes the covariance?
sayda
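A quick empirical probe of the quantity in question (a sketch using scipy; this only estimates the correlation, it does not settle the bounds):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    for name, dist in [("uniform", stats.uniform()),
                       ("normal", stats.norm()),
                       ("exponential", stats.expon())]:
        x = dist.rvs(size=200_000, random_state=rng)
        u = dist.cdf(x)                      # probability integral transform F(X) ~ U(0, 1)
        r = np.corrcoef(x, u)[0, 1]          # sample estimate of cor(X, F(X))
        print(f"{name:12s} cor(X, F(X)) ~ {r:.3f}")

For the uniform the correlation is exactly 1, since $F(X)$ is then linear in $X$.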
6 votes, 1 answer

Bounding residual variance with distance from mean

For a linear regression $Y = X\beta + \varepsilon$ with $\varepsilon \sim \mathcal N(0,\sigma^2 I)$, we have $\hat Y = H Y$ for $H = X(X^TX)^{-1}X^T$. This means that $Var(Y - \hat Y) = \sigma^2(I-H)$ so in particular $Var(Y_i - \hat Y_i) =…
alfalfa
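For intuition on the title's "distance from mean": in simple regression the leverage has the closed form $h_{ii} = \frac{1}{n} + \frac{(x_i - \bar{x})^2}{\sum_j (x_j - \bar{x})^2}$, so $Var(Y_i - \hat Y_i) = \sigma^2(1 - h_{ii})$ shrinks as $x_i$ moves away from $\bar{x}$. A small numerical check (a sketch):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])     # design matrix with intercept
    H = X @ np.linalg.inv(X.T @ X) @ X.T     # hat matrix H = X (X'X)^{-1} X'
    h = np.diag(H)                           # leverages h_ii

    # Closed form: leverage grows with squared distance from the mean of x
    h_closed = 1/n + (x - x.mean())**2 / ((x - x.mean())**2).sum()
    print(np.allclose(h, h_closed))          # True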
6 votes, 1 answer

KL divergence bounds square of L1 norm

In Cover & Thomas, Elements of Information Theory, in the section on the Conditional Limit Theorem (11.6), it is proved that the KL divergence bounds the squared $\mathcal{L}_1$ norm from above: $$\frac{1}{2\ln 2}\|p_1-p_2\|_1^2 \leq D[p_1\|p_2].$$ The presented proof…
Shlomi A
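This is Pinsker's inequality; the $\frac{1}{2\ln 2}$ constant corresponds to measuring $D$ in bits. A quick numerical check over random distributions (a sketch):

    import numpy as np

    rng = np.random.default_rng(0)
    for _ in range(5):
        p, q = rng.dirichlet(np.ones(4)), rng.dirichlet(np.ones(4))
        kl_bits = np.sum(p * np.log2(p / q))       # D[p||q] with base-2 logs (bits)
        l1_sq = np.sum(np.abs(p - q))**2           # squared L1 distance
        print(kl_bits >= l1_sq / (2 * np.log(2)))  # Pinsker: prints True every time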
5 votes, 2 answers

How to calculate lower bound on $P \left[|Y| > \frac{|\lambda|}{2} \right]$?

Let $Y$ be a random variable such that $E[Y] = \lambda$, $\lambda \in \mathbb{R}$ and $E[Y^2]<\infty$. The problem is to find a lower bound on the probability $$ P \left[|Y| > \frac{|\lambda|}{2} \right]. $$ Any leads would be appreciated.
Bhisham
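One standard lead (a sketch; not claimed to be the intended solution) is the Paley-Zygmund inequality applied to $Z = |Y|$. Since $E|Y| \geq |E[Y]| = |\lambda|$,

$$P\left[|Y| > \frac{|\lambda|}{2}\right] \geq P\left[|Y| > \frac{E|Y|}{2}\right] \geq \frac{(E|Y|)^2}{4\,E[Y^2]} \geq \frac{\lambda^2}{4\,E[Y^2]}.$$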
5 votes, 3 answers

Is $P(|X_1|>k)\le P(|X_2|> k)$ when $X_i\sim N(\mu_i,\sigma^2)$ and $|\mu_2| \ge |\mu_1|$?

Suppose $X_1\sim N(\mu_1,\sigma^2)$ and $X_2\sim N(\mu_2,\sigma^2)$ where $\mu_2\ge \mu_1$. Since $\mu_2\ge \mu_1$, based on a characterization of stochastic ordering, we can say that $$P(X_1>c)\le P(X_2> c) \quad \text{ for any constant }c\,.$$ Now…
StubbornAtom
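A direct numerical probe of the claim (a sketch; uses the exact formula $P(|X|>k) = 1 - \Phi\!\left(\frac{k-\mu}{\sigma}\right) + \Phi\!\left(\frac{-k-\mu}{\sigma}\right)$):

    import numpy as np
    from scipy.stats import norm

    sigma, k = 1.0, 1.5
    for mu in np.linspace(0.0, 3.0, 7):
        # P(|X| > k) for X ~ N(mu, sigma^2)
        tail = 1 - norm.cdf(k, mu, sigma) + norm.cdf(-k, mu, sigma)
        print(f"mu = {mu:.1f}: P(|X| > {k}) = {tail:.4f}")

The printed tail probabilities are nondecreasing in $\mu \geq 0$, consistent with the conjectured ordering.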
5 votes, 1 answer

Difference of two KL divergences

The Kullback-Leibler (KL) divergence between two distributions $P$ and $Q$ is defined as $$\operatorname{KL}(P \| Q) = \mathbb{E}_P\left[\ln \frac{\mathrm{d}P}{\mathrm{d}Q}\right].$$ My question: suppose there are three distributions $P, Q, R$; is it…
5 votes, 2 answers

How can I establish an inequality between $|\frac1n \sum_{i=1}^nX_i|$ and $\frac1n\sum^n_{i=1}|X_i|$ where $X_i \sim N(0,1)$?

Let $X_1, \ldots , X_n$ be a random sample from a $N(0,1)$ population. Define $Y_1=|\frac1n \sum_{i=1}^nX_i|$ and $Y_2=\frac1n\sum^n_{i=1}|X_i|$. Find a relationship between $E(Y_1)$ and $E(Y_2)$. I have a feeling that I will need to use Jensen's…
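For what it's worth, the direction follows pointwise from the triangle inequality (which is also what Jensen's inequality gives here, since $|\cdot|$ is convex):

$$Y_1 = \left|\frac{1}{n}\sum_{i=1}^n X_i\right| \leq \frac{1}{n}\sum_{i=1}^n |X_i| = Y_2 \quad \text{pointwise, hence } E(Y_1) \leq E(Y_2).$$

Explicitly, $\frac{1}{n}\sum_i X_i \sim N(0, 1/n)$ and $E|Z| = \sqrt{2/\pi}$ for $Z \sim N(0,1)$, so $E(Y_1) = \sqrt{2/(\pi n)}$ while $E(Y_2) = \sqrt{2/\pi}$.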
4 votes, 1 answer

Hoeffding type concentration result for the inverse of a sum of iid random variables

Consider a collection of $n$ i.i.d. Bernoulli random variables $\{ X_i \}_{i=1}^{n}$ with $\mathbb{E}[X_i] = \mu$. Then, if $\hat{\mu}$ is the mean of the $n$ random variables, i.e. $$\hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} X_i,$$ then, by…
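One common reduction (a sketch assuming $\mu > 0$; not necessarily what the asker's source does): on the event $|\hat{\mu} - \mu| \leq t$ with $t \leq \mu/2$, we have $\hat{\mu} \geq \mu/2$, hence

$$\left|\frac{1}{\hat{\mu}} - \frac{1}{\mu}\right| = \frac{|\mu - \hat{\mu}|}{\mu\,\hat{\mu}} \leq \frac{2t}{\mu^2},$$

while Hoeffding's inequality controls the bad event: $P(|\hat{\mu} - \mu| > t) \leq 2e^{-2nt^2}$.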
4 votes, 1 answer

Upper bound for absolute third central moment

Suppose $X\in \mathbb{R}$ is a random variable with expected value $\mathbb{E}X = \mu$. I ran across a proof which uses the inequality $$ \mathbb{E}[|X - \mu|^3] \leq 2^3 \mathbb{E}|X|^3. $$ Can anyone help me understand where this inequality is…
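One way to see it (a sketch): by Minkowski's inequality in $L^3$ together with $|\mu| = |\mathbb{E}X| \leq (\mathbb{E}|X|^3)^{1/3}$ (Jensen),

$$\big(\mathbb{E}|X-\mu|^3\big)^{1/3} \leq \big(\mathbb{E}|X|^3\big)^{1/3} + |\mu| \leq 2\big(\mathbb{E}|X|^3\big)^{1/3},$$

and cubing both sides gives $\mathbb{E}[|X - \mu|^3] \leq 2^3\, \mathbb{E}|X|^3$.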