It is well known in risk theory that a random variable with a decreasing failure rate (DFR) has an increasing mean residual lifetime (IMRL). The converse, however, is not true. Is there a counterexample, i.e., a distribution that is IMRL but not DFR?
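For reference, these are the definitions I am working with, assuming a nonnegative, absolutely continuous lifetime $X$ with density $f$ and survival function $\bar F = 1 - F$, and writing $r$ for the failure (hazard) rate and $m$ for the mean residual lifetime:

$$r(t) = \frac{f(t)}{\bar F(t)}, \qquad m(t) = E[X - t \mid X > t] = \frac{1}{\bar F(t)}\int_t^\infty \bar F(u)\,du .$$

DFR means $r$ is nonincreasing and IMRL means $m$ is nondecreasing. If I am not mistaken, the forward implication follows from the representation

$$m(t) = \int_0^\infty \frac{\bar F(t+u)}{\bar F(t)}\,du = \int_0^\infty \exp\!\left(-\int_t^{t+u} r(s)\,ds\right)du,$$

since a nonincreasing $r$ makes each integrand nondecreasing in $t$.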
A solution to the analogous question regarding increasing failure rate and decreasing mean residual lifetime would also be appreciated.