The statistics lectures I followed also presented the Poisson distribution. We were taught that the number of events occurring in a fixed time interval follows a Poisson distribution.
$ P(k \text{ events in time interval}) = e^{-\lambda} \frac{\lambda^k}{k!} $
The context of this stochastic process is queueing: events occur at random moments and have to wait before being processed.
Initial question: What is the deeper causal/stochastic reason why the Poisson distribution occurs so naturally? An elaborate derivation of the stochastic theory behind Poisson processes can be found here.
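As a sketch of one standard answer (not necessarily the deepest one): the Poisson distribution arises as the limit of a $\text{Binomial}(n, \lambda/n)$ when the time interval is split into $n$ independent subintervals, each with a small probability $\lambda/n$ of containing an event. The snippet below checks this convergence numerically; the parameter values and the use of NumPy/SciPy are my own choices for illustration.

```python
import numpy as np
from scipy.stats import binom, poisson

# Split a unit time interval into n subintervals, each with a small,
# independent probability p = lam / n of containing one event.
# As n grows, the Binomial(n, p) count converges to Poisson(lam).
lam = 4.0
k = np.arange(10)
for n in (10, 100, 10_000):
    p = lam / n
    max_diff = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)))
    print(f"n={n:>6}: max |Binomial - Poisson| pmf difference = {max_diff:.2e}")
```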
But what happens when the assumptions of homogeneity and independence do not hold across the timeline? Visits to many commercial websites are concentrated around noon and the early afternoon, so the number of visits per hour does not follow one Poisson distribution with a single rate over the day. The one-parameter Poisson distribution is too simple for such real-world scenarios.
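To make this concrete, here is a minimal simulation of an inhomogeneous Poisson process with a time-varying rate $\lambda(t)$ peaking at noon, using the standard thinning (Lewis-Shedler) method. The Gaussian-shaped intensity and all parameter values are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical intensity lambda(t): visits per hour, peaking around noon.
def intensity(t):
    return 50.0 + 200.0 * np.exp(-((t - 12.0) ** 2) / (2 * 2.0 ** 2))

LAM_MAX = 250.0  # upper bound on intensity(t) over the day (the peak value)

def simulate_day():
    """Thinning: draw candidate arrivals at the constant rate LAM_MAX,
    keep each candidate at time t with probability intensity(t) / LAM_MAX."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / LAM_MAX)
        if t > 24.0:
            return np.array(events)
        if rng.random() < intensity(t) / LAM_MAX:
            events.append(t)

days = [simulate_day() for _ in range(500)]

# Counts in a fixed hour are still Poisson, but their mean depends on the
# hour, so no single lambda describes the whole day.
noon = [np.sum((11 <= d) & (d < 12)) for d in days]
night = [np.sum((3 <= d) & (d < 4)) for d in days]
print("mean visits 11-12h:", np.mean(noon))   # high, near the peak
print("mean visits  3-4h: ", np.mean(night))  # low, off-peak
```

And if the daily intensity curve itself varies randomly from day to day, the resulting counts become overdispersed (variance greater than mean), which a single Poisson distribution cannot capture at all.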
Just a note: the log-normal distribution, for example, has two free parameters (the mean and standard deviation of the underlying normal), whereas the Poisson distribution has only the single parameter $\lambda$, which fixes both its mean and its variance.
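A quick numerical illustration of that constraint (sample sizes and parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Poisson: the single parameter lambda pins down both mean and variance.
x = rng.poisson(lam=7.0, size=100_000)
print(f"Poisson(7):          mean = {x.mean():.2f}, var = {x.var():.2f}")

# Log-normal: two parameters let mean and variance move independently.
y = rng.lognormal(mean=1.5, sigma=0.8, size=100_000)
print(f"Lognormal(1.5, 0.8): mean = {y.mean():.2f}, var = {y.var():.2f}")
```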
I am aware that I am pushing towards a foundation of statistics whose historical roots go back more than a century. Thanks for any comments and answers.