I wanted to share another reference. It's a "comment" paper, and actually part of an argument between two economists, Smith and Campbell, and a statistician, Marquardt. But it's relevant.
Link to the Marquardt JASA article. Sorry to refer you to a non-open-access article, but I highly recommend the read if you get a chance.
Marquardt argues strongly in favor of centering and scaling, for several reasons. One is that you then have prior knowledge about the absolute size of the standardized variables: they shouldn't be much bigger than 3, since by Chebyshev's inequality at most $1/k^2$ of any sample can lie more than $k$ standard deviations from the mean (so at most 1/9 beyond 3). That prior knowledge is a justification for "shrinkage" in techniques like ridge regression.
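A quick numerical check of that Chebyshev argument (my illustration, not from the paper): for any sample at all, at most $1/k^2$ of the standardized values can exceed $k$ in absolute value, so standardized predictors rarely stray past 3. The exponential sample here is just an arbitrary, deliberately skewed choice.

```python
import numpy as np

# Chebyshev bound check on a deliberately skewed sample:
# after standardizing, at most 1/9 of values can have |z| > 3.
rng = np.random.default_rng(3)
x = rng.exponential(5.0, 10_000)

z = (x - x.mean()) / x.std()        # center and scale
frac_beyond_3 = np.mean(np.abs(z) > 3)
print(frac_beyond_3)                # guaranteed <= 1/9, typically far less
```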
And of course there's the interpretation issue as well. He draws a cool picture of a parabola fit to data. Almost all the data lie on the right-hand branch of the parabola, where the curve is rising, so the fit looks almost linear with a positive slope. But the coefficient on the linear term is negative! (Remember there's a squared term in there: the linear coefficient is the slope at $x = 0$, which can be far outside the range of the data.)
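You can reproduce that picture in a few lines. This is my own sketch of the phenomenon, not Marquardt's example: all the data sit on the rising branch of $y = (x-1)^2$, the cloud trends steeply upward, yet the fitted linear coefficient is negative, and centering $x$ flips it to the slope people actually care about.

```python
import numpy as np

# All the data live well to the right of the parabola's vertex at x = 1,
# so y is increasing everywhere we observe it.
rng = np.random.default_rng(0)
x = np.linspace(2.0, 10.0, 50)
y = (x - 1.0) ** 2 + rng.normal(0.0, 0.5, x.size)

a, b, c = np.polyfit(x, y, deg=2)     # fit y = a*x^2 + b*x + c
print(b)                              # negative: the slope at x = 0, far from the data

# Center x first, and the linear coefficient becomes the slope at the
# data's center -- clearly positive, matching the picture.
a2, b2, c2 = np.polyfit(x - x.mean(), y, deg=2)
print(b2)
```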
Now a parabola $ax^2 + bx + c$ is really just a term interacting with itself. The same phenomenon happens with interactions: the main effect can take a nonsense value because you never observe the interacting variable at zero. Of course, some people say you shouldn't try to estimate main effects at all when there are interactions.
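Here's a sketch of the interaction version, with made-up data of my own (not from the paper). The true model is $y = 3x_1 + x_1 x_2$ with both predictors living around 10. In the uncentered fit, the "main effect" of $x_1$ is its slope at $x_2 = 0$, nowhere near the data; after centering, it becomes the slope at the average $x_2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.uniform(8.0, 12.0, n)
x2 = rng.uniform(8.0, 12.0, n)
y = 3.0 * x1 + x1 * x2          # slope in x1 near the data is 3 + x2, about 13

# Uncentered fit: the main effect of x1 comes out as 3, its slope at x2 = 0,
# a point we never observe.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
b_raw = np.linalg.lstsq(X, y, rcond=None)[0]
print(b_raw[1])

# Centered fit: the main effect of x1 is now its slope at the average x2.
x1c, x2c = x1 - x1.mean(), x2 - x2.mean()
Xc = np.column_stack([np.ones(n), x1c, x2c, x1c * x2c])
b_cen = np.linalg.lstsq(Xc, y, rcond=None)[0]
print(b_cen[1])
```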
To sum it up: models with interactions and polynomial terms are where centering will get you the most for your effort. And if you're going to apply shrinkage, or a Bayesian prior that says parameter estimates should be small, maybe you should standardize all your numeric variables first.
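One last sketch of why standardizing matters before shrinkage (again my own toy example, using the closed-form ridge estimate rather than anything from the paper): the ridge penalty is unit-dependent, so the same $\lambda$ shrinks a predictor measured in millimetres far less than the identical predictor measured in metres. Standardizing removes that arbitrariness.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate; X and y are assumed centered."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(2)
n = 100
x = rng.normal(0.0, 1.0, n)                 # predictor in "metres"
y = 2.0 * x + rng.normal(0.0, 0.1, n)
xc, yc = x - x.mean(), y - y.mean()

lam = 10.0
b_m = ridge(xc[:, None], yc, lam)[0]        # metres: noticeably shrunk toward 0
b_mm = ridge((1000.0 * xc)[:, None], yc, lam)[0] * 1000.0
                                            # millimetres, rescaled back: barely shrunk
print(b_m, b_mm)                            # same data, same lambda, different answers

# Standardize first and the units no longer matter:
z = (xc / x.std())[:, None]
b_std = ridge(z, yc, lam)[0]
print(b_std)
```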