I am doing an ordinary least squares fit. I have data that was recorded at different times. The relationship being fit could change over time, so I want to trust my older data less. The obvious way to do this is to weight each data point based on how old it is and use a weighted least squares fit.
For example, suppose I am trying to fit a linear relationship to estimate the number of people inside McDonald's based on the number of cars in the parking lot. I have [people, cars] data points from many locations over the last year, recorded at sporadic times. The most recent data should be more useful for predicting this relationship, since it may change over time, but I do not want to throw out older data at some arbitrary age cutoff.
Is there a standard approach to this sort of problem, or do I need to make some assumption about how the quality of my data decays with age? Is there a better way to conceptualize this problem?
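To make the weighting idea concrete, here is a minimal sketch of one common assumption: exponentially decaying weights with a chosen half-life. All the data (and the `half_life` value) below are made up for illustration; the half-life is exactly the kind of arbitrary assumption the question asks about.

```python
import numpy as np

# Hypothetical data: cars in the lot (x), people inside (y),
# and the age of each observation in days.
rng = np.random.default_rng(0)
cars = rng.uniform(5, 50, size=200)
age_days = rng.uniform(0, 365, size=200)
people = 1.5 * cars + 4 + rng.normal(0, 3, size=200)

# One assumed weighting scheme: exponential decay.
# half_life is an arbitrary choice, not a standard value.
half_life = 90.0  # days
w = 0.5 ** (age_days / half_life)

# Weighted least squares via the normal equations:
# minimize sum_i w_i * (y_i - X_i @ beta)^2
X = np.column_stack([np.ones_like(cars), cars])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * people))
intercept, slope = beta
```

A point a full half-life old counts half as much as a fresh one; the fit otherwise reduces to ordinary least squares as `half_life` grows large.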
EDIT: It is quite possible that there is a trend in the dependent variable as age changes; the measurement error itself should not change with age. It might be a better idea to fit that time dimension as well and then take it out. In the above example, this would mean fitting the number of customers inside as a function of both the cars outside and how long ago the measurement was taken, then using the current time for predictions.
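The EDIT idea can be sketched as follows: include the measurement age (and, under an additional assumption, an age-by-cars interaction to let the slope drift) as regressors, then predict with age set to zero, i.e. "now". The data-generating process below is invented for illustration.

```python
import numpy as np

# Hypothetical data where the cars->people slope drifts over time.
rng = np.random.default_rng(1)
n = 300
cars = rng.uniform(5, 50, size=n)
age_days = rng.uniform(0, 365, size=n)
people = (1.5 + 0.001 * age_days) * cars + 4 + rng.normal(0, 3, size=n)

# Fit with age as a covariate, plus an age x cars interaction
# (the interaction is one modeling assumption among many).
X = np.column_stack([np.ones(n), cars, age_days, age_days * cars])
beta, *_ = np.linalg.lstsq(X, people, rcond=None)

def predict_now(cars_now):
    """Predict people inside for the current time (age = 0)."""
    x = np.array([1.0, cars_now, 0.0, 0.0])
    return x @ beta
```

Setting age to zero at prediction time removes the fitted time trend, which is exactly the "fit that dimension and take it out after" idea.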