I am interested in quantifying the dispersion of pixel values in one image. More specifically, I want to get an average value of dispersion of the image column-wise (i.e., compute the dispersion of each column of the image and then "average" these values).
Currently, I am using two different metrics: the standard deviation (SD) and the interquartile range (IQR). For the former, the proper way to do this is to average the variances of each column and then take the square root. Is there a similar consideration that should be taken into account for the IQR, or is it valid to simply average the IQR values?
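To make the question concrete, here is a minimal NumPy sketch of the two computations I have in mind, using a synthetic grayscale image (the image data and its size are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 100x50 grayscale image standing in for the real data.
img = rng.normal(loc=128.0, scale=20.0, size=(100, 50))

# Pooled SD: average the per-column variances, then take the square root.
col_var = img.var(axis=0, ddof=1)      # sample variance of each column
pooled_sd = np.sqrt(col_var.mean())

# IQR: computed per column, then (naively) averaged.
q75, q25 = np.percentile(img, [75, 25], axis=0)
col_iqr = q75 - q25                    # IQR of each column
mean_iqr = col_iqr.mean()
```

The question is whether `mean_iqr` above is a statistically sensible summary, or whether the IQR, like the SD, needs some transformation before averaging.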