
I read this post about feature scaling: all-about-feature-scaling

The two main feature scaling techniques are:

Min-max scaler, which works well for features whose distributions are not Gaussian.

Standard scaler, which works well for features with Gaussian distributions.
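
For reference, these are the standard definitions of the two transforms I mean (per-feature minimum, maximum, mean $\mu$ and standard deviation $\sigma$):

$$x' = \frac{x - x_{\min}}{x_{\max} - x_{\min}} \quad \text{(min-max)}, \qquad z = \frac{x - \mu}{\sigma} \quad \text{(standard)}$$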

I read other posts and examples, and it seems that we always use one scaling method (min-max or standard) for all the features.

I haven't seen an example or paper that suggests the following:

1. Go over all the features, and for each feature:
   1.1 Check the feature's distribution.
   1.2 If the distribution is Gaussian:
       1.2.1 Use the standard scaler for this feature.
   1.3 Otherwise:
       1.3.1 Use the min-max scaler for this feature.

(I sketch this procedure in code below, after my questions.)
  1. Why don't we mix the scaling methods?

  2. What is wrong with my proposal, or what are its disadvantages?
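
A minimal sketch of what I have in mind, assuming scikit-learn's `StandardScaler`/`MinMaxScaler` and using `scipy.stats.normaltest` as the Gaussian check (the 0.05 threshold and the `build_mixed_scaler` name are just my own choices for illustration):

```python
import numpy as np
import pandas as pd
from scipy.stats import normaltest
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import MinMaxScaler, StandardScaler

def build_mixed_scaler(X: pd.DataFrame, alpha: float = 0.05) -> ColumnTransformer:
    """Use StandardScaler for (roughly) Gaussian columns, MinMaxScaler for the rest."""
    gaussian_cols, other_cols = [], []
    for col in X.columns:
        # normaltest's null hypothesis is that the sample comes from a normal
        # distribution, so a large p-value means we cannot reject normality.
        _, p_value = normaltest(X[col].dropna())
        (gaussian_cols if p_value > alpha else other_cols).append(col)
    return ColumnTransformer([
        ("standard", StandardScaler(), gaussian_cols),
        ("minmax", MinMaxScaler(), other_cols),
    ])

# Toy example: one Gaussian-looking column and one uniform column.
rng = np.random.default_rng(0)
X = pd.DataFrame({"gauss": rng.normal(size=500), "unif": rng.uniform(size=500)})
X_scaled = build_mixed_scaler(X).fit_transform(X)
print(X_scaled.shape)  # (500, 2)
```

Note that `ColumnTransformer` outputs the standard-scaled columns first and the min-max-scaled columns after them, so the column order can differ from the input.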

  • Why do you want to scale the features? You ought to have some reason ... – kjetil b halvorsen May 01 '20 at 02:55
  • Each feature has its own scale, and from reading on the internet it seems to require a preprocessing step. Am I missing something? – user3668129 May 01 '20 at 03:27
  • It is a misconception that feature scaling is always needed! That depends on specific models used, and modeling goals. See https://stats.stackexchange.com/questions/201909/when-to-normalize-data-in-regression/202002#202002 – kjetil b halvorsen May 01 '20 at 18:14
