I have a series of vectors, which are measurements from one sample at two time points.
First time point:

1. [5,3,2,4,4,3,6,5] (mean = 4.000)
2. [5,6,3,3,4,3,4,5] (mean = 4.125)
3. [6,3,4,2,5,3,5,7] (mean = 4.375)
... etc.

grand mean = 4.17
Second time point:

1. [1,2,1,2,1,3,4,5] (mean = 2.375)
2. [2,2,3,1,1,3,3,5] (mean = 2.500)
3. [1,3,1,2,2,3,5,4] (mean = 2.625)
... etc.

grand mean = 2.5
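To make the data concrete, the summary statistics above can be reproduced from the listed vectors. This minimal sketch (assuming numpy is available) uses only the three vectors shown per time point; the remaining "... etc." vectors are omitted, and the names `time1`/`time2` are just placeholders:

```python
import numpy as np

# The three example vectors listed above for each time point
# (the remaining "... etc." vectors are not reproduced here).
time1 = np.array([
    [5, 3, 2, 4, 4, 3, 6, 5],
    [5, 6, 3, 3, 4, 3, 4, 5],
    [6, 3, 4, 2, 5, 3, 5, 7],
])
time2 = np.array([
    [1, 2, 1, 2, 1, 3, 4, 5],
    [2, 2, 3, 1, 1, 3, 3, 5],
    [1, 3, 1, 2, 2, 3, 5, 4],
])

for label, data in [("first time point", time1), ("second time point", time2)]:
    means = data.mean(axis=1)  # per-vector means
    print(label, "means:", means, "grand mean:", round(means.mean(), 3))
```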
I want to test whether the variance of each measurement/vector differs significantly between the two time points.
However, the vectors at the second time point have a lower mean, and the mean level itself can drive the overall variance. How do you compare the variances of two conditions when their means differ?
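For concreteness, here is a minimal sketch of what I have in mind (again using only the three example vectors per time point, and assuming numpy/scipy are available). It computes the raw sample variances, and then two possible ways of accounting for the different mean levels: the coefficient of variation and a median-centered Levene (Brown-Forsythe) test. I am not sure either of these is actually appropriate, which is the heart of my question.

```python
import numpy as np
from scipy import stats

# Same example vectors as above (only the ones shown in the question).
time1 = np.array([[5,3,2,4,4,3,6,5], [5,6,3,3,4,3,4,5], [6,3,4,2,5,3,5,7]])
time2 = np.array([[1,2,1,2,1,3,4,5], [2,2,3,1,1,3,3,5], [1,3,1,2,2,3,5,4]])

# Raw sample variances (ddof=1) for each vector.
print("variances, time 1:", time1.var(axis=1, ddof=1))
print("variances, time 2:", time2.var(axis=1, ddof=1))

# Coefficient of variation (sd / mean) puts the spread on a scale
# relative to each vector's own mean.
print("CV, time 1:", time1.std(axis=1, ddof=1) / time1.mean(axis=1))
print("CV, time 2:", time2.std(axis=1, ddof=1) / time2.mean(axis=1))

# Brown-Forsythe (median-centered Levene) test compares spread after
# removing each group's own center, shown here for the first pair of vectors.
stat, p = stats.levene(time1[0], time2[0], center="median")
print("Brown-Forsythe for vector 1:", stat, "p =", p)
```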