Let $y_1,\dots,y_M \in \mathbb{R}^r$ be vectors. Suppose that the last entry of each vector $y_i$ is $i$, so that the vectors are sorted by their last entry. I solved the quadratic programming problem:
minimize $|y_M-y_1-w|^2$ subject to $w^t y_{i+1} > w^t y_i$ for $i=1,\dots,M-1$.
This is always feasible, since the vectors are sorted: the vector $w$ with last entry $1$ and zeros elsewhere gives $w^t y_i = i$. Let $w$ be the solution of this problem and $w_r$ its last entry. Empirically I observed that (for large $M$):
1) $w^t y_{i+1} - w^t y_i$ is normally distributed $N(w_r,\sigma^2)$
2) $w^t y_i$ is uniformly distributed.
3) $\sigma < w_r$
4) $w_r = M/2$
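For concreteness, the quadratic program can be sketched on synthetic data, e.g. in Python with scipy (an assumption for illustration: the strict inequalities are relaxed to a small margin `eps` so that a standard solver applies; all names and parameter values below are invented, not from the original setup):

```python
# Sketch of the quadratic program on synthetic data (numpy/scipy assumed).
# The strict constraints w^t y_{i+1} > w^t y_i are relaxed to a margin eps,
# an extra assumption needed by a standard inequality-constrained solver.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
M, r, eps = 50, 5, 1e-3

# Synthetic vectors whose last entry is i, so they are sorted by it.
Y = rng.normal(size=(M, r))
Y[:, -1] = np.arange(1, M + 1)

def objective(w):
    d = Y[-1] - Y[0] - w
    return d @ d

# One inequality constraint per consecutive pair of vectors.
constraints = [{'type': 'ineq',
                'fun': (lambda w, i=i: w @ (Y[i + 1] - Y[i]) - eps)}
               for i in range(M - 1)]

res = minimize(objective, x0=np.zeros(r), constraints=constraints)
proj = Y @ res.x
assert np.all(np.diff(proj) > 0)  # w^t y_i is strictly increasing
```

With the constraints satisfied, the projections $w^t y_i$ form the increasing sequence whose spacings and distribution the observations 1)–4) are about.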
Now I am asking myself whether there is a chance to prove 2) under the condition that 1) & 3) hold. Hence I am considering the following problem:
Let $a < a_1 < a_2 < \cdots < a_n < b$ ($a,b$ constant), where the increments $a_{i+1}-a_i \sim N(n/2, \sigma^2)$ are independent and normally distributed. Assume that $\sigma \ll n/2$.
Can one prove, or is it true, or does somebody know a heuristic for why $a_i$ is uniformly distributed over the interval $(a,b)$?
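One possible heuristic (a sketch, not a proof, and ignoring the conditioning on the endpoints $a,b$): compare $a_i$ with the equally spaced sequence it concentrates around. Writing $\mu = n/2$,

$$a_i = a_1 + \sum_{k=1}^{i-1} X_k, \qquad X_k \sim N(\mu,\sigma^2)\ \text{i.i.d.},$$

so $a_i = a_1 + (i-1)\mu + O(\sigma\sqrt{i})$ by the central limit theorem. The deviation from the equally spaced sequence $a_1 + (i-1)\mu$ is of order $\sigma\sqrt{i} \le \sigma\sqrt{n}$, which is small compared with the total range $\approx n\mu$ whenever $\sigma \ll \mu\sqrt{n}$ (in particular under $\sigma \ll \mu$), and the empirical distribution of $n$ equally spaced points in $(a,b)$ is within $1/n$ of the uniform distribution in Kolmogorov–Smirnov distance. Without drift ($\mu = 0$), by contrast, $a_i$ is a plain random walk, and its occupation measure resembles that of a Brownian path, which is far from uniform.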
One can run an experiment in R, for example:
> rn <- rnorm(1000,1000/2,60)
> a <- cumsum(rn)
> b <- runif(length(a),min(a),max(a))
> ks.test(a,b)
Two-sample Kolmogorov-Smirnov test
data: a and b
D = 0.024, p-value = 0.9356
alternative hypothesis: two-sided
This phenomenon seems to occur only if the expected value of the increments is much greater than their standard deviation. For example:
> rn <- rnorm(1000,0,1)
> a <- cumsum(rn)
> b <- runif(length(a),min(a),max(a))
> ks.test(a,b)
Two-sample Kolmogorov-Smirnov test
data: a and b
D = 0.142, p-value = 3.499e-09
alternative hypothesis: two-sided
Does anybody have an explanation for this?
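As a sanity check on the drift intuition (in Python with numpy/scipy rather than R, purely because that is what I had at hand; the $\sigma \to 0$ limit below is an idealization, not the original experiment): with zero noise the cumulative sums are exactly equally spaced, and their one-sample KS distance to the uniform distribution on their range is exactly $1/n$.

```python
# sigma -> 0 limit of the first experiment: exactly equally spaced points.
# Their one-sample KS distance to Uniform(min(a), max(a)) is exactly 1/n.
import numpy as np
from scipy.stats import kstest

n, mu = 1000, 500.0
a = np.arange(1, n + 1) * mu          # a_i = i * mu, zero-noise cumsum
d = kstest(a, 'uniform', args=(a.min(), a.max() - a.min())).statistic
print(d)  # D = 1/n = 0.001
```

With $\sigma \ll \mu$ each point is shifted by only a few spacings, so $D$ should stay small; the $D = 0.024$ in the first run above is plausibly dominated by the sampling noise of the uniform comparison sample, which alone is of order $\sqrt{2/n} \approx 0.045$.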