I am running several random forest regression models on different datasets. In each, I have a continuous DV and ~30 dichotomous predictors. I don't expect these predictors to explain much variance. What I am really interested in is which ones are related to the dependent variable.
In some datasets, the model explains ~5% of the variance, which is about what I would expect. But in others it is below 1%, and the score is sometimes even negative.
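For concreteness, here is a minimal sketch of the kind of setup I mean, using simulated data and scikit-learn. The names and the weak-signal structure are just assumptions for illustration; the point is that the out-of-bag R² can go negative while variable importances are still computed:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Simulated data resembling the setup: continuous DV, 30 binary predictors.
n, p = 500, 30
X = rng.integers(0, 2, size=(n, p))
# Only the first two predictors carry a weak signal (illustrative assumption).
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(size=n)

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)

# oob_score_ is an out-of-sample R^2 estimate; it can be negative when the
# model predicts worse than simply using the mean of y.
print("OOB R^2:", rf.oob_score_)

# Importances are defined regardless of how low the R^2 is.
print("Importances:", rf.feature_importances_)
```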
This made me wonder: is there some minimum level of explained variance below which the variable importances from a model shouldn't be interpreted?