A common practice for summing variables that are on different scales is to standardize them by subtracting the mean and dividing by the standard deviation. However, this produces negative values, which is sometimes a problem (e.g., here).
I am wondering whether subtracting the minimum (instead of the mean) would still make sense while avoiding negative values, as in the following formula: $$ Index_{i} = \frac{x_{i}-\min_{x}}{SD_{x}} + \frac{y_{i}-\min_{y}}{SD_{y}} + \frac{z_{i}-\min_{z}}{SD_{z}} $$ with variables $x, y, z \geq 0$.
Would this be a problem in terms of scale? That is, is there any issue with summing three variables normalized in this way? The variance is still taken into account, since I divide by the SD, but does subtracting the minimum instead of the mean introduce range problems or other issues?
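If I am not mistaken, within a single fixed sample the min-based index is just the classic standardized sum plus a constant shift, so it only moves the values into the nonnegative range. Here is a minimal sketch of what I am computing, with simulated and purely illustrative data (the variable names and distributions are hypothetical):

```python
import numpy as np

# Hypothetical data for three correlated variables (names and distributions are illustrative).
rng = np.random.default_rng(0)
x = rng.normal(50, 10, size=100)
y = rng.normal(5, 2, size=100)
z = rng.normal(200, 40, size=100)

def min_scaled(v):
    """Subtract the sample minimum and divide by the sample SD."""
    return (v - v.min()) / v.std(ddof=1)

def z_scored(v):
    """Classic standardization: subtract the mean, divide by the SD."""
    return (v - v.mean()) / v.std(ddof=1)

index_min = min_scaled(x) + min_scaled(y) + min_scaled(z)
index_z = z_scored(x) + z_scored(y) + z_scored(z)

# Within one fixed sample, the two indices differ only by a constant:
# the sum of (mean - min)/SD over the three variables.
shift = ((x.mean() - x.min()) / x.std(ddof=1)
         + (y.mean() - y.min()) / y.std(ddof=1)
         + (z.mean() - z.min()) / z.std(ddof=1))
print(np.allclose(index_min, index_z + shift))  # True
print(index_min.min() >= 0)                     # True: no negative values
```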
Would this bias results in a time-series analysis compared with classic standardization, $ \frac{x_{i}-\mu_{x}}{SD_{x}} $? Take the case in which I measure this indicator ($Index_{i}$) for two people every week, but each week the reference sample changes, with a different mean, minimum, and SD.
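Here is a rough sketch of the weekly scenario I have in mind (one variable only, hypothetical numbers): the same raw scores get different index values each week under either formula, and the min-based version additionally depends on the sample minimum, which tends to fluctuate more across samples than the mean does.

```python
import numpy as np

# Hypothetical weekly scenario: the same two people have fixed raw scores on x,
# but the reference sample used for min/mean/SD changes from week to week.
rng = np.random.default_rng(1)
persons_x = np.array([48.0, 55.0])        # illustrative raw scores for two people

week1 = rng.normal(50, 10, size=100)      # reference sample, week 1
week2 = rng.normal(50, 10, size=100)      # reference sample, week 2 (new draw)

def index_min(v, ref):
    return (v - ref.min()) / ref.std(ddof=1)

def index_z(v, ref):
    return (v - ref.mean()) / ref.std(ddof=1)

for label, ref in [("week 1", week1), ("week 2", week2)]:
    print(label,
          "min-based:", np.round(index_min(persons_x, ref), 3),
          "z-based:", np.round(index_z(persons_x, ref), 3))
# Both versions move when the reference sample changes; the min-based one also
# depends on the sample minimum, an extreme-value statistic that is less stable
# from sample to sample than the mean.
```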
EDIT
Let me add that summing these variables makes sense in our previous experience, as they are correlated and reflect the same underlying construct (even though they measure different things). Summing them after subtracting the mean and dividing by the SD works well; we have already tested it. But the negative values are a problem, which is why I am asking about the minimum. Assuming that a sum of standardized values is OK, would it still be OK if I replace the mean with the minimum?