I have read the paper "Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning".
My question is: what does "arbitrary error functions" mean?
An arbitrary error (or loss) function is exactly what the name says: an arbitrary, i.e. "any", loss function. There are (infinitely) many loss functions, with mean squared error and logistic loss being among the most popular. Formally, as defined by Christian P. Robert in The Bayesian Choice,
Definition 2.1.1 A loss function is any function $\mathrm{L}$ from $\Theta\times\mathcal{D}$ in $[0, +\infty)$.
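To make this concrete, here is a minimal sketch of the two losses mentioned above, written in the spirit of that definition (for squared error, $\theta$ is the true value and $d$ the decision/prediction; for logistic loss, $y \in \{0,1\}$ is the true label and $p \in (0,1)$ the predicted probability):

$$
\mathrm{L}_{\text{MSE}}(\theta, d) = (\theta - d)^2,
\qquad
\mathrm{L}_{\text{logistic}}(y, p) = -\bigl[\, y \log p + (1-y)\log(1-p) \,\bigr].
$$

Both map into $[0, +\infty)$, so both satisfy the definition, and so do infinitely many other choices; that is all "arbitrary" means here.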
A loss function is the negative of a utility function: where a utility function tells us that one outcome is better than another, a loss function tells us that it is worse. Utility is something we maximize; loss is something we minimize. It follows that a loss function penalizes the incorrectness of the results, and an "arbitrary loss function" is just any function used for this purpose.