
As part of the data pre-processing in the paper *Cloud Security: LKM and Optimal Fuzzy System for Intrusion Detection in Cloud Environment*, numeric attributes were normalized using the following equation:

$$X'_{ij} = \frac{X_{ij} - M_i}{\mathrm{Max} - \mathrm{Min}}$$

Where:

  • $X_{ij}$ is the value to be normalized
  • $M_i$ is the mean value of the feature
  • $\mathrm{Max}$, $\mathrm{Min}$ are the maximum and minimum values of the feature, respectively.

From what I understand, there is a difference between standardization and scaling, and the equation above represents neither. Standardization, as I know it, requires dividing by the standard deviation, while min-max scaling requires subtracting the minimum value before dividing by the value range, as explained here.

As I couldn't find any source mentioning this equation, I don't know whether it is simply wrong or whether it has some specific advantage.
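To make the comparison concrete, here is a minimal sketch of my own (not taken from the paper) applying the three transforms to a made-up toy feature with NumPy:

```python
import numpy as np

# Toy feature column (made-up values, just for illustration)
x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

mean, std = x.mean(), x.std()
xmin, xmax = x.min(), x.max()

standardized = (x - mean) / std            # zero mean, unit variance
min_max      = (x - xmin) / (xmax - xmin)  # rescaled to [0, 1]
paper_eq     = (x - mean) / (xmax - xmin)  # the equation from the paper

print(standardized)  # [-1.41421356 -0.70710678  0.          0.70710678  1.41421356]
print(min_max)       # [0.   0.25 0.5  0.75 1.  ]
print(paper_eq)      # [-0.5  -0.25  0.    0.25  0.5 ]
```

As the toy output shows, the paper's equation centres the feature at zero (like standardization) while bounding the output to a total range of 1 (like min-max scaling); I believe this variant is sometimes called *mean normalization*, though I haven't found the paper saying so.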

  • Sadly, I think the words "normalized" and "scaled" are used in non-standard ways by a lot of authors. – StatsStudent Aug 30 '20 at 14:13
  • 1
    There actually is no consensus on the definitions of the three terms (standardize vs scale vs normalize) and when to use one or the other. My own habit of words would be: standardizing is to take to zero mean and unit variance; scaling is to take to unit sum-of-squares (mostly); and normalizing can be of various nature. – ttnphns Aug 30 '20 at 14:46

0 Answers