Questions tagged [tuning]

30 questions
6
votes
1 answer

Is there a hard distinction between hyperparameter vs parameter in machine learning?

I was watching Andrew Ng's lecture on the difference between parameter vs hyperparameter, https://www.youtube.com/watch?v=VTE2KlfoO3Q&ab_channel=Deeplearning.ai, and a question came to me. Is there really that much of a distinction between…
Olórin
  • 674
  • 5
  • 16
3
votes
1 answer

Model tuning in the presence of incorrect training labels

I have a situation where I have a large amount of labeled data (~40 million records) with a binary outcome variable that has about 50% positive and 50% negative cases. The issue is that I know that the true proportion for these 40 million is more…
astel
  • 1,388
  • 5
  • 17
3
votes
2 answers

Cross-validation for (hyper)parameter tuning to be performed in validation set or training set?

I am learning about the use of cross-validation with grid search to choose the best hyperparameters for an SVM. The problem I came across is that the references and examples of its application do not follow a single standard. On the one hand, I have seen…
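One common resolution, sketched here with scikit-learn on a synthetic dataset (an assumption about the asker's setup): split off a test set first, and run the cross-validated grid search only inside the training portion, where each fold plays the role of a validation set in turn.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# Hold out a test set that the search never sees.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Cross-validation happens inside the training set: each of the
# 5 folds acts as the validation set for the other 4.
search = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=5)
search.fit(X_train, y_train)

# The untouched test set gives the final, unbiased estimate.
test_score = search.score(X_test, y_test)
```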
2
votes
0 answers

Using hyperopt for finding n_estimators for ensemble model?

I am trying to tune my model (an ensemble model with xgboost, lightgbm and catboost) with hyperopt, but I don't know how to find the optimal n_estimators value. When I use the best_parameters returned by fmin to train the model, apparently I have to…
2
votes
2 answers

Range of Values for Hyperparameter Fine-Tuning in Random Forest Classification

I have implemented a random forest classifier. At the moment, I am thinking about how to tune its hyperparameters. Of course, I am doing a grid-search type of algorithm while checking CV errors. The problem is that I have no clue…
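For starting ranges, a common (though by no means canonical) grid spans max_features around sqrt(p) or a fraction of the features, a few coarse leaf sizes, and a couple of forest sizes; a minimal scikit-learn sketch:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, random_state=0)

# Commonly tried starting ranges (an assumption, not a rule):
param_grid = {
    "n_estimators": [50, 150],
    "max_features": ["sqrt", 0.5],
    "min_samples_leaf": [1, 5, 20],
}

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3)
search.fit(X, y)
```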
2
votes
1 answer

Tuning hyperparameters with simulated data, do I need to use cross-validation or can I just give it simulated data sets from different seeds?

I am doing a method comparison of some machine learning models across certain scenarios. I simulated data where associations are known. To me, this seems like a simple way to have as much data as I want to train, tune, and test models (over and…
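With simulated data, the independence between draws does the job that CV folds normally do: tune on datasets from some seeds and evaluate on datasets from fresh seeds. A minimal sketch (the linear generating model below is invented for illustration):

```python
import numpy as np

def simulate(seed, n=200):
    # Known generating model: y = Xw + noise, with w fixed.
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=n)
    return X, y

# Fresh datasets per seed play the role of CV folds: fit on one
# draw, score on an independent draw, repeat across seeds.
scores = []
for seed in range(5):
    X_fit, y_fit = simulate(seed)
    X_val, y_val = simulate(seed + 100)  # independent draw
    w, *_ = np.linalg.lstsq(X_fit, y_fit, rcond=None)
    scores.append(float(np.mean((y_val - X_val @ w) ** 2)))
```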
2
votes
0 answers

How many hyperparameters should be optimized in machine learning?

I am using neural network algorithms for a relatively large dataset with 1700 observations and 40 features. I performed optimization by nested cross-validation. I also wanted to compare 5 algorithms with each other by benchmarking. When I select about 5…
Killbill
  • 177
  • 4
2
votes
1 answer

Need to do hyperparameter tuning for new features?

Suppose I have a set of features (say 100 features) and spent a lot of time doing hyperparameter tuning to get a good model. Now I have a few new features (say fewer than 5) added to the training set; should I re-do hyperparameter tuning…
2
votes
0 answers

High dimensional hyperparameter tune

Many well-known optimization techniques rely on past evaluations (Bayesian optimization, for instance) and perform really well for a handful of hyperparameters. Is there, however, a good tuner/tuning method that does well with thousands of hyperparameters…
player1
  • 21
  • 1
2
votes
2 answers

Tuning hyperparameters never affects weights?

I am trying to better understand “tuning the hyperparameters”. I understand how to use GridSearchCV, I found the below explanation useful: “As we do not know whether those parameters affect each other, doing it right will require that we train a…
Chicago1988
  • 649
  • 4
  • 16
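A point worth making explicit here: each grid candidate is fitted from scratch, so the learned weights do change with every hyperparameter setting; tuning is the outer loop that picks among those independently trained fits. A small scikit-learn illustration (logistic regression chosen arbitrarily):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=150, random_state=0)

# Each hyperparameter setting gets its own freshly fitted model,
# so weights differ across grid points; the "tuning" only decides
# which of those independent fits to keep.
search = GridSearchCV(LogisticRegression(max_iter=1000),
                      {"C": [0.01, 1.0, 100.0]}, cv=3)
search.fit(X, y)

# Three parameter settings were each cross-validated.
n_candidates = len(search.cv_results_["params"])
```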
2
votes
1 answer

Perceptual Loss Layers Selection

I understand that in order to improve your generative model's performance it is quite useful to compare your output and the target in feature space, as stated in the paper Perceptual Losses for Real-Time Style Transfer and Super-Resolution, in…
1
vote
0 answers

How many epochs should I iterate for tuning an ANN?

I have been using Keras-Tuner for tuning my ANN before going into training. The tuner seems to iterate forever even though I set a limit of 1000 epochs, so I decided to terminate the tuning process after 500 epochs. However, I am…
user366312
  • 1,464
  • 3
  • 14
  • 34
1
vote
0 answers

Tuning parameters for multiple regression models

I am trying to compare multiple regression algorithms to estimate biomass (dependent variable): KNeighborsRegressor, GaussianProcessRegressor, LinearRegression, BayesianRidge, Ridge, SGDRegressor, PassiveAggressiveRegressor, DecisionTreeRegressor,…
1
vote
0 answers

xgboost hyperparameters: interactions that make the model overfit on training set

I am dealing with a classification problem on an unbalanced dataset (the positive class is just over 1% of the sample). I did hyperparameter tuning using a train-validation split, and then finally trained the model and checked my metrics of interest…
1
vote
0 answers

Relationship between learning_rate and n_estimators (LightGBM) to improve hyperparameter tuning speed

I am running a Hyperopt search over a LightGBM regressor on a large dataset to tune max_depth, min_child_samples, reg_lambda and learning_rate, with n_estimators static at 10000 and num_leaves dynamically set to (2^max_depth)-1. It works, but it is…