The second edition of James et al., "An Introduction to Statistical Learning" (2021), contains a new chapter on survival analysis and censored data (Chapter 11). Section 11.6 discusses shrinkage for the Cox model. When selecting the optimal shrinkage intensity $\lambda$ in regression or classification tasks, one often uses cross-validation, which requires measuring the model's performance on a left-out fold. The text says this is nontrivial for the Cox model because some observations are censored. It offers some guidelines on assessing model fit nonnumerically, but I do not see how that would help in selecting the optimal regularization intensity for a regularized Cox model. Hence my questions:
1. How can one numerically assess the out-of-sample performance of a (regularized) Cox model?
2. How can one select the optimal regularization intensity in a regularized Cox model?
(An answer to question 1 would help answer question 2, but perhaps question 2 can be addressed in a different way altogether.)
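For concreteness, one numeric measure I have seen mentioned in this context is Harrell's concordance index (C-index), which only uses pairs of observations that are comparable under censoring. Below is a minimal sketch, in plain Python on toy data, of how it could be computed on a held-out fold; the function name and the toy values are my own, not from the book:

```python
def harrell_c_index(times, events, risks):
    """Harrell's concordance index for right-censored data.

    times  : observed times (event time if events[i] == 1, else censoring time)
    events : 1 if the event was observed, 0 if the observation is censored
    risks  : predicted risk scores (higher score = event expected earlier)
    """
    concordant = 0.0
    comparable = 0
    n = len(times)
    for i in range(n):
        # A pair (i, j) with times[i] < times[j] is comparable only if
        # subject i actually experienced the event (not censored).
        if events[i] != 1:
            continue
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0      # risk ordering agrees with time ordering
                elif risks[i] == risks[j]:
                    concordant += 0.5      # ties get half credit
    return concordant / comparable

# Toy held-out fold: observed times, event indicators, and risk scores
# as they might come from a Cox model fitted on the training folds.
times  = [2.0, 3.0, 5.0, 7.0, 9.0]
events = [1,   0,   1,   1,   0]
risks  = [2.1, 0.5, 1.3, 0.1, 0.2]
print(harrell_c_index(times, events, risks))  # → 0.857142... (6 of 7 comparable pairs concordant)
```

The idea would then be to compute such a score on each left-out fold for every candidate $\lambda$ and pick the $\lambda$ with the best average, in analogy with cross-validated MSE in regression; whether this (or a cross-validated partial likelihood) is the recommended approach is exactly what I am asking.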