Chebyshev's inequality generalizes the 68-95-99.7 rule for normal distributions: for any distribution with finite variance, it bounds the probability mass lying at least a given number of standard deviations away from the mean.
$$ P\big( \big\vert X-\mu \big\vert \ge k\sigma \big)\le\dfrac{1}{k^2} $$
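For concreteness, here is a quick simulation sketch of the univariate bound (not part of the question itself; the exponential distribution is an arbitrary choice with finite variance), comparing the empirical tail mass against $1/k^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)   # any distribution with finite variance
mu, sigma = x.mean(), x.std()

for k in (1.5, 2.0, 3.0):
    # empirical P(|X - mu| >= k*sigma) versus the Chebyshev bound 1/k^2
    tail = np.mean(np.abs(x - mu) >= k * sigma)
    print(f"k={k}: empirical tail {tail:.4f} <= bound {1/k**2:.4f}")
```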
For a multivariate distribution, can we do something similar, with the Mahalanobis distance from the mean playing the role of the standardized deviation $|X-\mu|/\sigma$? I would expect the inequality to involve the dimension of the random vector $X$ and to reduce to the usual Chebyshev inequality when $X$ is univariate.
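To make the question concrete, this is the kind of empirical check I have in mind (a rough sketch only; the dimension, mixing matrix, and base distribution are arbitrary choices, and $p/k^2$ is just my conjectured candidate bound alongside $1/k^2$):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 3, 200_000
# a correlated, non-Gaussian sample: componentwise exponentials mixed by a fixed matrix
A = rng.normal(size=(p, p))
x = rng.exponential(size=(n, p)) @ A.T

mu = x.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
# squared Mahalanobis distance of each row from the sample mean
d2 = np.einsum('ij,jk,ik->i', x - mu, cov_inv, x - mu)

for k in (1.5, 2.0, 3.0):
    tail = np.mean(d2 >= k**2)   # empirical P(D_M(X) >= k)
    print(f"k={k}: tail {tail:.4f}, 1/k^2 = {1/k**2:.4f}, p/k^2 = {p/k**2:.4f}")
```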