I have the following problem: I'm training a neural network on a set of target values (a regression problem). The targets are unbounded (anywhere in $(-\infty, \infty)$) and I can't normalize them, because they arrive continuously from a data stream. I'm currently using MSE, but it sometimes produces very large losses that destabilize training. To avoid this I wanted to bound the loss somehow, so I came up with the following loss:
$$ L(y, \hat{y}) = \frac{1}{N}\sum_{i=1}^N \log\!\left((y_i - \hat{y}_i)^2 + 1\right) $$
where $y$ are the true values and $\hat{y}$ the predictions.
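For concreteness, here is a minimal PyTorch sketch of this loss (the name `log_mse_loss` is just my own label for it):

```python
import torch

def log_mse_loss(y_pred: torch.Tensor, y_true: torch.Tensor) -> torch.Tensor:
    # log1p(x) = log(1 + x), so this computes mean(log((y - y_hat)^2 + 1))
    # in a numerically stable way for small residuals.
    return torch.log1p((y_true - y_pred) ** 2).mean()
```

The $+1$ inside the log keeps the argument positive and makes the loss zero for a perfect prediction. My intent is that for small residuals $r = y - \hat{y}$, $\log(r^2 + 1) \approx r^2$, while for large residuals the loss grows only logarithmically, which is the damping effect I'm after.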
My question is whether a loss like this exists in the literature, because I can't find anything similar. If not, would it have properties similar to the squared loss, or behave differently?