I am dealing with a regression model where both the DV and IV are log-transformed.
I have found this explanation of how to interpret the effects (both in the Cross Validated link and in documents from the UCLA and Cornell stats outreach pages). Specifically:
Example D: outcome transformed and exposure transformed:

log(DV) = Intercept + B1 * log(IV) + Error

"One percent increase in IV is associated with a (B1) percent increase in DV."
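To spell out the algebra behind that rule of thumb (my own restatement of the model as written, not taken from the linked sources): differencing the fitted model at two IV values gives

$$\log(\mathrm{DV}_\text{new}) - \log(\mathrm{DV}_\text{old}) = B_1\left[\log(\mathrm{IV}_\text{new}) - \log(\mathrm{IV}_\text{old})\right] \;\Rightarrow\; \frac{\mathrm{DV}_\text{new}}{\mathrm{DV}_\text{old}} = \left(\frac{\mathrm{IV}_\text{new}}{\mathrm{IV}_\text{old}}\right)^{B_1} \approx 1 + B_1\,\frac{\Delta\mathrm{IV}}{\mathrm{IV}_\text{old}} \quad \text{for small } \Delta\mathrm{IV},$$

which is where "a 1% increase in IV gives roughly a B1% increase in DV" comes from.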
In my regression the IV is a continuous variable: the date each year on which an event occurred (e.g., Julian date 130).
Let's say my B1 coefficient is 0.5. I'd say a 1% increase in the IV is associated with a 0.5% increase in the DV.
So what is a 1% increase in date? Is that a 1% increase in the mean date? For example, if the mean date were 130, a 10% increase would be the event occurring on day 143, and would result in a 5% increase in the DV?
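To make the arithmetic concrete, here is a minimal numeric check of that example (B1 = 0.5 is just the value assumed in this question, not an estimate from real data); the 5% figure is the usual first-order approximation, while the exact multiplicative effect is (1.10)^0.5:

```python
# Numeric check of the worked example above, assuming B1 = 0.5.
b1 = 0.5
old_iv, new_iv = 130.0, 143.0              # a 10% increase in the IV

# Exact multiplicative change implied by log(DV) = a + B1*log(IV):
exact_ratio = (new_iv / old_iv) ** b1
print(f"exact change in DV: {100 * (exact_ratio - 1):.2f}%")   # ~4.88%

# First-order rule of thumb: B1 percent per 1 percent change in IV.
print(f"rule of thumb:      {b1 * 10:.2f}%")                   # 5.00%
```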
What if I had centered the date, so that the mean date = 0? B1 from the regression is still 0.5, but it obviously doesn't make sense to talk about a 10% increase over a mean of 0. Centered, anomaly-type data are common in my field, so the example isn't overly contrived.
How can I interpret this effect?
What seems most intuitive to me would be to express the percent change as a percent of the range of the IV. For example, if the range of the IV were 30 days, a 10% change would be 3 days, irrespective of what the absolute mean value is.
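As a sketch of the range-based reading I have in mind (the 120 and 150 endpoints are hypothetical, picked only to give a 30-day range):

```python
# Sketch of the range-based interpretation proposed above.
iv_min, iv_max = 120.0, 150.0        # hypothetical observed dates
iv_range = iv_max - iv_min           # 30-day range

pct = 0.10                           # a "10% change" in my proposed sense
shift_days = pct * iv_range          # 3 days, irrespective of the mean date
print(f"{pct:.0%} of the {iv_range:g}-day range = {shift_days:g} days")
```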
Does that make any sense?