
I have a family of random variables $y(x)$ and a set of data points $(x_i, y_i)$. I know that, for each $x$, $y(x)$ is exponentially distributed. I have a hypothesis that the mean of the distribution depends on $x$, i.e. that $y(x)$ is distributed exponentially with the mean $f(x)$.

My current hypothesis about $f(x)$ is that it is equal to $a e^{b (x - c)^2} + d$, for some values of the parameters $a$, $b$, $c$, and $d$, which I would like to estimate using some sort of regression.

From what I know, the standard approach to fitting a curve is the method of least squares. However, least squares assumes normally distributed errors, whereas in my case the data are exponentially distributed.

I found some information about generalized linear models, which avoid the normality assumption, but in my case $f(x)$ is non-linear in the parameters, so it does not fit the GLM framework directly.

In textbooks, I saw that the least-squares method is treated as a special case of maximum likelihood estimation. However, I was not able to find examples of maximum likelihood being used for curve fitting in the general case, or for exponentially distributed data in particular.

So, my question is: how should I approach this problem? Is there a common method for fitting a non-linear curve with exponentially distributed data points?
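To make the question concrete, here is a minimal sketch of the maximum-likelihood approach I have in mind, using `scipy` on simulated data (the "true" parameter values and the starting point are arbitrary choices of mine, not from any real dataset). The negative log-likelihood follows from the exponential density with mean $f(x_i)$, namely $-\log L = \sum_i \left[\log f(x_i) + y_i / f(x_i)\right]$:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothesized mean function, with parameters a, b, c, d as in the question
def f(x, a, b, c, d):
    return a * np.exp(b * (x - c) ** 2) + d

# Simulated data: at each x_i, y_i is exponential with mean f(x_i)
# (true parameter values are arbitrary, chosen so the mean stays positive)
true_params = (2.0, -0.5, 1.0, 0.5)
x = np.linspace(-2.0, 4.0, 500)
y = rng.exponential(scale=f(x, *true_params))

# Negative log-likelihood of Exponential(mean = f(x_i)):
#   -log L = sum_i [ log f(x_i) + y_i / f(x_i) ]
def nll(params):
    mu = f(x, *params)
    if np.any(mu <= 0):       # mean of an exponential must be positive
        return np.inf
    return np.sum(np.log(mu) + y / mu)

# Starting point is a guess; a gradient-free method avoids deriving gradients
start = (1.0, -1.0, 0.0, 1.0)
res = minimize(nll, x0=start, method="Nelder-Mead",
               options={"maxiter": 10000})
print(res.x)
```

I am unsure whether this is a standard or sound way to do it, e.g. whether the resulting estimates have good properties, or whether there is an established method (or library routine) for this setting.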

  • Emulating the approach I described at https://stats.stackexchange.com/a/64039/919 might solve your problem. – whuber Jun 05 '20 at 13:11
