
It is well known in risk theory that if a random variable has a decreasing failure rate (DFR), then it has an increasing mean residual lifetime (IMRL). However, the converse is not true. Is there a counterexample?

A solution to the analogous question regarding increasing failure rate (IFR) and decreasing mean residual lifetime (DMRL) would also be appreciated.
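
For clarity, these are the definitions I have in mind (the notation is mine), for a nonnegative random variable $X$ with density $f$ and survival function $\bar F(t) = P(X > t)$:

$$h(t) = \frac{f(t)}{\bar F(t)} \quad \text{(failure rate)}, \qquad m(t) = E[X - t \mid X > t] = \frac{\int_t^\infty \bar F(u)\,du}{\bar F(t)} \quad \text{(mean residual lifetime)}.$$

So DFR means $h$ is nonincreasing on the support, IMRL means $m$ is nondecreasing, and I am asking for a distribution with nondecreasing $m$ whose $h$ is not nonincreasing (and similarly for the IFR/DMRL case).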
