Suppose $F$ is the CDF of some random variable on some support, and that $a,b$ are constants with $a<1<b$. I'm hoping to find a distribution such that $$F \left( \frac{x}{a} \right) - F \left( \frac{x}{b} \right) = \text{constant},$$ i.e. the probability assigned to the interval $\left( \frac{x}{b}, \frac{x}{a} \right)$ does not depend on $x$.
Intuitively, as $x$ increases, the density (the slope of the CDF) has to decrease at just the right rate to offset the growing length of the interval $\left( \frac{x}{b}, \frac{x}{a} \right)$.
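To make that intuition precise (assuming $F$ is differentiable with density $f$), differentiating the condition with respect to $x$ gives $$\frac{1}{a} f\!\left( \frac{x}{a} \right) - \frac{1}{b} f\!\left( \frac{x}{b} \right) = 0 \quad \text{for all } x,$$ i.e. the density at the two scaled points has to stay in the fixed ratio $f(x/a)/f(x/b) = a/b$.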
If it helps, in my use case $a = \frac{1}{1+c}$ and $b = \frac{1}{1-c}$ with $c\in(0,1)$, so the interval is $\left( x(1-c),\, x(1+c) \right)$, a symmetric multiplicative band around $x$.
I'm not sure where to begin here. Is there a way to prove that such a distribution does or does not exist? Is there a well-known distribution that already satisfies this? Is there a way to simulate this?
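For what it's worth, the kind of numerical check I have in mind is sketched below (assuming scipy; the lognormal is just an arbitrary placeholder candidate, not something I expect to satisfy the condition):

```python
import numpy as np
from scipy import stats

# Check whether a candidate CDF F makes F(x/a) - F(x/b) constant in x.
c = 0.3                       # use-case parameter, c in (0, 1)
a = 1.0 / (1.0 + c)           # a < 1
b = 1.0 / (1.0 - c)           # b > 1

F = stats.lognorm(s=1.0).cdf  # placeholder candidate CDF

x = np.linspace(0.1, 10.0, 200)   # grid of positive x values
diff = F(x / a) - F(x / b)        # constant in x iff the candidate works

print(diff.min(), diff.max())     # roughly equal min/max => roughly constant
```

The spread between the min and max over the grid is how I'd judge how far a candidate is from satisfying the condition.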
Thank you!