I am working on a binary classification problem with a class proportion of 77:23 (977 records).
I am currently exploring feature selection approaches and came across methods like the ones below:
a) Featurewiz
b) Sequential forward and backward feature selection
c) BorutaPy
d) RFE, etc.
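For context, here is a minimal sketch of what I mean by (b), using scikit-learn's SequentialFeatureSelector with a logistic regression as the wrapped model; the dataset, estimator, and `n_features_to_select=5` are purely illustrative, not my actual setup:

```python
# Illustrative sketch of sequential forward selection (method (b) above).
# The synthetic data mimics the 977-record, 77:23 class balance.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=977, n_features=20,
                           weights=[0.77, 0.23], random_state=0)

sfs = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),  # wrapped model with DEFAULT params
    n_features_to_select=5,
    direction="forward",  # use "backward" for backward elimination
    cv=3,
)
sfs.fit(X, y)
print(sfs.get_support())  # boolean mask over the 20 input features
```

Note that the selector refits the wrapped model many times, which is exactly why I am unsure whether that model needs to be tuned first.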
All of the above methods use an ML model to find the best-performing features.
My questions are:
a) Do we have to tune the underlying model's hyperparameters first in order to get the best features?
b) If yes, then once we have selected the features, do we have to run GridSearchCV again to find the best hyperparameters before fitting and predicting?
Or is it sufficient to use default parameters for feature selection and tuned parameters only for the final model?
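One alternative I have seen suggested (and would like opinions on) is to skip the two-stage question entirely by putting the selector and the classifier in one Pipeline and letting GridSearchCV tune both jointly; a hedged sketch with illustrative parameter values:

```python
# Sketch: tune feature selection and model hyperparameters together.
# Grid values and estimators are placeholders, not recommendations.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=977, n_features=20,
                           weights=[0.77, 0.23], random_state=0)

pipe = Pipeline([
    ("select", RFE(estimator=LogisticRegression(max_iter=1000))),
    ("clf", LogisticRegression(max_iter=1000)),
])

param_grid = {
    "select__n_features_to_select": [5, 10, 15],
    "clf__C": [0.1, 1.0, 10.0],
}

# roc_auc is a reasonable scorer given the 77:23 imbalance
search = GridSearchCV(pipe, param_grid, scoring="roc_auc", cv=5)
search.fit(X, y)
print(search.best_params_)
```

This way the number of features and the model's hyperparameters are chosen by the same cross-validation, so there is no "tune before or after selection" ordering problem, at the cost of a larger grid.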