A danger of having powerful tools built into standard statistical software is that users don't always understand the hidden assumptions underlying them.
Even if you will be using Stata for routine work, I recommend getting a copy of An Introduction to Statistical Learning and working through the LASSO and ridge regression examples in Chapter 6, using the R code provided. That will take you through the steps involved in building a penalized regression model.
What's hidden in the Stata function call is that it automatically performs cross-validation to find the "best" penalty for the regression coefficients. In principle any penalty value could be used. Based on the output you posted, Stata evidently uses cross-validation for the optimization, selecting the penalty that gives the lowest mean cross-validated deviance. That is not the only possible criterion; those who want fewer predictors in the model might instead choose the largest penalty whose model performs within 1 standard error of that minimum (the "1-SE rule").
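If you work through the ISLR labs, the distinction between those two criteria looks like this in R's glmnet (the package those labs use); here `x` and `y` are placeholders for your own predictor matrix and outcome vector:

```r
## Minimal sketch of the two penalty-selection criteria with glmnet.
## `x` (numeric predictor matrix) and `y` (outcome vector) are placeholders.
library(glmnet)

set.seed(1)                           # CV folds are random, so fix the seed
cvfit <- cv.glmnet(x, y, alpha = 1)   # alpha = 1 gives the LASSO

cvfit$lambda.min   # penalty minimizing mean cross-validated error/deviance
cvfit$lambda.1se   # largest penalty within 1 SE of that minimum (sparser model)

plot(cvfit)        # CV curve; both penalties are marked with vertical lines
```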
If you are happy with the minimum cross-validated deviance criterion, then you would use the penalized coefficient values returned by Model 42 for a prediction model.
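In the glmnet sketch above, that amounts to reading off the coefficients at `lambda.min` and predicting with them directly, with no refitting (`x_new`, a hypothetical matrix of new observations, is my placeholder):

```r
## Continuing the sketch above: use the penalized coefficients at the
## minimum-CV penalty directly for prediction, without refitting.
coef(cvfit, s = "lambda.min")                    # penalized coefficients (some exactly 0)
predict(cvfit, newx = x_new, s = "lambda.min")   # predictions for new data x_new
```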
The issue of inference (p values, confidence intervals, etc.) with predictor-selection approaches like LASSO is difficult. Once you have used the data to help select the predictors, the assumptions underlying the standard formulas no longer hold. For example, you could not simply take the 3 predictors kept in Model 42, run a standard linear regression with them, and trust the p values that are reported. There are some recent approaches that take the predictor selection into account. If you want to pursue those issues, start with a careful reading of Statistical Learning with Sparsity.
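If you just need something defensible before digging into that literature, one classic approach that is easy to get right is data splitting: let the LASSO choose the predictors on one half of the data, then fit ordinary least squares and read off p values on the other half. A rough sketch, continuing with the placeholder `x` and `y` from above:

```r
## Data splitting: select on one half of the data, do inference on the other.
library(glmnet)

n   <- nrow(x)
sel <- sample(n, n %/% 2)                               # random half, used only for selection

cv_half <- cv.glmnet(x[sel, ], y[sel], alpha = 1)
b    <- as.vector(coef(cv_half, s = "lambda.min"))[-1]  # drop the intercept
keep <- which(b != 0)                                   # predictors the LASSO retained

## OLS on the held-out half: its p values are valid because these
## observations played no part in choosing the predictors.
fit <- lm(y[-sel] ~ x[-sel, keep, drop = FALSE])
summary(fit)
```

The price of data splitting is lost power, since only half the observations are available at each stage; the selective-inference methods covered in that book try to avoid that loss.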
Ridge regression, which keeps all predictors in the model but penalizes their coefficients to minimize over-fitting, might provide some advantages over LASSO in your case. See this page for an introduction to the issues around p values and the like with ridge regression.
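In glmnet terms, ridge is the same cross-validated workflow with `alpha = 0` instead of 1:

```r
## Ridge regression in the same sketch: alpha = 0 shrinks all coefficients
## toward zero but keeps every predictor in the model.
cv_ridge <- cv.glmnet(x, y, alpha = 0)
coef(cv_ridge, s = "lambda.min")   # every predictor retained, coefficients shrunk
```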