I have a dataset in Excel that looks like this:
Impressions  Clicks  CTR
          2       2  1
          5       4  0.8
         14       5  0.357142857
    2481574    2946  0.00118715
    4835785    7453  0.001541218
    6224189   10903  0.001751714
    5364316    7595  0.001415838
    4445746    7003  0.001575214
    3789942    3881  0.001024026
    3015085    3290  0.00109118
    2795202    2823  0.001009945
    2727213    2976  0.001091224
    2770730    3318  0.001197518
    2770752    3932  0.001419109
    2467906    3515  0.001424284
    1842635    3470  0.001883173
    1714929    3833  0.002235078
    1642027    3808  0.002319085
    1688948    4123  0.002441165
    2210116    5298  0.002397159
    1758368    4485  0.002550661
      48611      55  0.001131431
          1       0  0
          1       0  0
          1       0  0
          1       0  0
          1       0  0
          5       0  0
          2       0  0
          1       0  0
I want to determine the weighted standard deviation of the CTR column in Excel, using Impressions as the weights. For example, several rows have a CTR of 0, but that's only because they had a single impression, so they shouldn't carry much weight in the variance, let alone the mean.
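To be explicit about what I mean (this is my assumption of the usual definition, with Impressions as the weights w_i and CTR as the values x_i):

\bar{x}_w = \frac{\sum_i w_i x_i}{\sum_i w_i}, \qquad \sigma_w = \sqrt{\frac{\sum_i w_i (x_i - \bar{x}_w)^2}{\sum_i w_i}}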
Does anyone know an Excel formula that can do this? I am looking specifically for an Excel formula, not just a mathematical one.
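In case it helps to see the shape of what I'm after, I imagine it would be something along these lines, combining SUMPRODUCT, SUM, and SQRT (the ranges are just my guess at where the data sits, with Impressions in A2:A31 and CTR in C2:C31), but I'm not confident this is the right way to do it:

    =SQRT(SUMPRODUCT(A2:A31, (C2:C31 - SUMPRODUCT(A2:A31, C2:C31)/SUM(A2:A31))^2) / SUM(A2:A31))

That would be the population-style weighted standard deviation (dividing by the total weight); I'm not sure whether I should be using a bias-corrected denominator instead.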